Everything You Know Is Wrong - I was social-engineered


Re: Everything You Know Is Wrong - I was social-engineered
« Reply #25 on: Aug 31, 2012, 06:57:20 pm »
 

Brocke


Example 26

DNA evidence is proof of an individual's identity. (Matches? We don't need no stinkin' matches!)




DNA: GENES AS EVIDENCE
A crime lab's findings raise doubts about the reliability of genetic profiles. The bureau pushes back.
July 20, 2008|Jason Felch and Maura Dolan | Times Staff Writers

http://articles.latimes.com/2008/jul/20/local/me-dna20/4

State crime lab analyst Kathryn Troyer was running tests on Arizona's DNA database when she stumbled across two felons with remarkably similar genetic profiles. The men matched at nine of the 13 locations on chromosomes, or loci, commonly used to distinguish people. The FBI estimated the odds of unrelated people sharing those genetic markers to be as remote as 1 in 113 billion. But the mug shots of the two felons suggested that they were not related: One was black, the other white.

In the years after her 2001 discovery, Troyer found dozens of similar matches -- each seeming to defy impossible odds.

As word spread, these findings by a little-known lab worker raised questions about the accuracy of the FBI's DNA statistics and ignited a legal fight over whether the nation's genetic databases ought to be opened to wider scrutiny. The FBI laboratory, which administers the national DNA database system, tried to stop distribution of Troyer's results and began an aggressive behind-the-scenes campaign to block similar searches elsewhere, even those ordered by courts, a Times investigation found. At stake is the credibility of the compelling odds often cited in DNA cases, which can suggest an all but certain link between a suspect and a crime scene.

When DNA from such clues as blood or skin cells matches a suspect's genetic profile, it can seal his fate with a jury, even in the absence of other evidence. As questions arise about the reliability of ballistic, bite-mark and even fingerprint analysis, genetic evidence has emerged as the forensic gold standard, often portrayed in courtrooms as unassailable. But DNA "matches" are not always what they appear to be. Although a person's genetic makeup is unique, his genetic profile -- just a tiny sliver of the full genome -- may not be. Siblings often share genetic markers at several locations, and even unrelated people can share some by coincidence.

No one knows precisely how rare DNA profiles are. The odds presented in court are the FBI's best estimates.

The Arizona search was, in effect, the first test of those estimates in a large state database, and the results were surprising, even to some experts.


Lawyers seek searches

Defense attorneys seized on the Arizona discoveries as evidence that genetic profiles match more often than the official statistics imply -- and are far from unique, as the FBI has sometimes suggested. Now, lawyers around the country are asking for searches of their own state databases. Several scientists and legal experts want to test the accuracy of official statistics using the 6 million profiles in CODIS, the national system that includes most state and local databases.

"DNA is terrific and nobody doubts it, but because it is so powerful, any chinks in its armor ought to be made as salient and clear as possible so jurors will not be overwhelmed by the seeming certainty of it," said David Faigman, a professor at UC Hastings College of the Law, who specializes in scientific evidence.

FBI officials argue that, under their interpretation of federal law, use of CODIS is limited to criminal justice agencies. In their view, defense attorneys are allowed access to information about their specific cases, not the databases in general.

Bureau officials say critics have exaggerated or misunderstood the implications of Troyer's discoveries. Indeed, experts generally agree that most -- but not all -- of the Arizona matches were to be expected statistically because of the unusual way Troyer searched for them. In a typical criminal case, investigators look for matches to a specific profile. But the Arizona search looked for any matches among all the thousands of profiles in the database, greatly increasing the odds of finding them.

As a result, Thomas Callaghan, head of the FBI's CODIS unit, has dismissed Troyer's findings as "misleading" and "meaningless."

He urged authorities in several states to object to Arizona-style searches, advising them to tell courts that the probes could violate the privacy of convicted offenders, tie up crucial databases and even lead the FBI to expel offending states from CODIS -- a penalty that could cripple states' ability to solve crimes.

In one case, Callaghan advised state officials to raise the risk of expulsion with a judge but told the officials that expulsion was unlikely to actually happen, according to a record of the conversation filed in court. In an interview with The Times, Callaghan denied any effort to mislead the court.

The FBI's arguments have persuaded courts in California and other states to block the searches. But in at least two states, judges overruled the objections. The resulting searches found nearly 1,000 more pairs that matched at nine or more loci.

"I can appreciate why the FBI is worried about this," said David Kaye, an expert on science and the law at Arizona State University and former member of a national committee that studied forensic DNA. But "people's lives do ride on this evidence," he said. "It has got to be explained."


Concerned about errors

From her first discovery in 2001, Troyer and her colleagues in the Arizona Department of Public Safety's Phoenix DNA lab were intrigued.

At the time, many states looked at only nine or fewer loci when searching for suspects. (States now commonly attempt to compare 13 loci, and they may be able to search for more in the future. But even now, in many cases, fewer than 13 loci are discernible from crime scene evidence because of contamination or because of degradation over time.) Based on Troyer's results, she and her colleagues believed that a nine-locus match could point investigators to the wrong person.

"We felt it was interesting and just wanted people to understand it could happen," said Troyer, who initially declined to be interviewed, then cautiously discussed her findings by telephone, with her bosses on the line. "If you're going to search at nine loci, you need to be aware of what it means," said Todd Griffith, director of the Phoenix lab. "It's not necessarily absolutely the guy."

Troyer made a simple poster for a national conference of DNA analysts. It showed photos of the white man and the younger black man next to their remarkably similar genetic profiles. Some who saw the poster said they had seen similar matches in their own labs. Bruce Budowle, an FBI scientist who specializes in forensic DNA, told colleagues of Troyer that such coincidental matches were to be expected.

Three years later, Bicka Barlow, a San Francisco defense attorney, came across a description of Troyer's poster on the Internet.

Its implications became clear as she prepared to defend a client accused of a 20-year-old rape and murder. A database search had found a nine-locus match between his DNA profile and semen found in the victim's body. Based on FBI estimates, the prosecutor said the odds of a coincidental match were as remote as 1 in 108 trillion.

Recalling the Arizona discovery, Barlow wondered if there might be similar coincidental matches in California's database -- the world's third-largest, with 360,000 DNA profiles at the time. The attorney called Troyer in Phoenix to learn more. Troyer seemed eager to talk about her discovery, which still had her puzzled, Barlow recalled. The analyst told Barlow she had searched the growing Arizona database since the conference and found more pairs of profiles matching at nine and even 10 loci.

Encouraged, Barlow subpoenaed a new search of the Arizona database. Among about 65,000 felons, there were 122 pairs that matched at nine of 13 loci. Twenty pairs matched at 10 loci. One matched at 11 and one at 12, though both later proved to belong to relatives. Barlow was stunned. At the time, such matches were almost unheard of.

That same year, Fred Bieber, a Harvard professor and expert in forensic DNA, testified in an unrelated criminal case that just once had he seen a pair of profiles matching at nine of 13 markers, and they belonged to brothers. He had heard of a 10-locus match between two men, but it was the result of incest -- a man whose father was also his older brother. Indeed, since 2000, the FBI has treated certain rare DNA profiles as essentially unique -- attributable to a single individual "to a reasonable degree of scientific certainty."

Other crime labs have adopted the policy, and some no longer tell jurors there is even a possibility of a coincidental match.

Soon after Barlow received the results, Callaghan, the head of the FBI's DNA database unit, reprimanded Troyer's lab in Phoenix, saying it should have sought the permission of the FBI before complying with the court's order in the San Francisco case. Asked later whether Callaghan had threatened her lab, Troyer said in court, "I wouldn't say it's been threatened, but we have been reminded."

Dwight Adams, director of the FBI lab at the time, faxed Griffith, Troyer's boss, a letter saying the Arizona state lab was "under review" for releasing the search results. "While we understand that the Arizona Department of Public Safety, acting in good faith, complied with a proper judicial court order in the release of the nine-loci search of your offender DNA records, this release of DNA data was not authorized," Adams wrote, asking Arizona to take "appropriate corrective action." Arizona officials obtained a court order to prevent Barlow from sharing the results with anyone else.

But it was too late. After a judge found the Arizona results to be irrelevant in Barlow's case, the defense attorney e-mailed them to a network of her colleagues and DNA experts around the country. Soon, defense lawyers in other states were seeking what came to be known as "Arizona searches."

'Don't panic'

For years, DNA's strength in the courtroom has been the brute power of its numbers. It's hard to argue with odds like 1 in 100 billion. Troyer's discovery threatened to turn the tables on prosecutors. At first blush, the Arizona matches appeared to contradict those statistics and the popular notion that DNA profiles, like DNA, were essentially unique. Law enforcement experts scrambled to explain.

Three months after the court-ordered search in Arizona, Steven Myers, a senior DNA analyst at the California Department of Justice, gave a presentation to the Assn. of California Crime Lab Analysts. It was titled "Don't Panic" -- a hint at the alarm Troyer's discovery had set off. Many of the Arizona matches were predictable, Myers said, given the type of search Troyer had conducted. In a database search for a criminal case, a crime scene sample would have been compared to every profile in the database -- about 65,000 comparisons. But Troyer compared all 65,000 profiles in Arizona's database to each other, resulting in about 2 billion comparisons. Each comparison made it more likely she would find a match.

When this "database effect" was considered, about 100 of the 144 matches Troyer had found were to be expected statistically, Myers found. Troyer's search also looked for matches at any of 13 genetic locations, while in a real criminal case the analyst would look for a particular profile -- making a match far less likely.

Further, any nonmatching markers would immediately rule out a suspect. In the case of the black and white men who matched at nine loci, the four loci that differed -- if available from crime scene evidence -- would have ensured that the wrong man was not implicated. The presence of relatives in the database could also account for some of Troyer's findings, the FBI and other experts say. Whether that's the case would require cumbersome research because the databases don't contain identifying information, they say.


Flaws in assumptions?

Some scientists are not satisfied by these explanations. They wonder whether Troyer's findings signal flaws in the complex assumptions that underlie the FBI's rarity estimates.

In the 1990s, FBI scientists estimated the rarity of each genetic marker by extrapolating from sample populations of a few hundred people from various ethnic or racial groups. The estimates for each marker are multiplied across all 13 loci to come up with a rarity estimate for the entire profile. These estimates make assumptions about how populations mate and whether genetic markers are independent of each other. They also don't account for relatives. Bruce Weir, a statistician at the University of Washington who has studied the issue, said these assumptions should be tested empirically in the national database system. "Instead of saying we predict there will be a match, let's open it up and look," Weir said.
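
The "multiplied across all 13 loci" step (often called the product rule) is simple arithmetic. The sketch below uses invented per-locus frequencies purely to show the mechanics; real estimates rest on the population samples and independence assumptions Weir wants tested.

Code: (Python)
# Illustrative product-rule calculation: the rarity of a full profile is
# estimated by multiplying per-locus genotype frequencies. The frequencies
# below are invented for illustration; real casework uses published,
# population-specific allele-frequency tables plus statistical corrections.
from math import prod

locus_frequencies = [0.08, 0.05, 0.11, 0.07, 0.09, 0.06, 0.10,
                     0.08, 0.07, 0.05, 0.09, 0.06, 0.08]   # 13 hypothetical loci

full_profile = prod(locus_frequencies)
nine_loci = prod(locus_frequencies[:9])

print(f"13-locus random-match probability: about 1 in {1 / full_profile:,.0f}")
print(f" 9-locus random-match probability: about 1 in {1 / nine_loci:,.0f}")

The gap between the nine-locus and thirteen-locus figures is why partial matches are so much more common than full-profile matches, and the result depends entirely on the assumed frequencies and on the independence assumptions behind multiplying them.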

Some experts predict that given the rapid growth of CODIS, such a search would produce one or more examples of unrelated people who are identical at all 13 loci. Such a discovery was once unimaginable.


'Dire consequences'

In January 2006, not long after Barlow distributed the results of the court-ordered search in Arizona, the FBI sent out a nationwide alert to crime labs warning of similar defense requests. Soon after, the bureau's arguments against the searches were being made in courtrooms around the country.

In California, Michael Chamberlain, a state Department of Justice official, persuaded judges that such a search could have "dire consequences" -- violating the privacy of convicted offenders, shutting down the database for days and risking the state's expulsion from the FBI's national DNA system. All this for a search whose results would be irrelevant and misleading to jurors, Chamberlain argued. When similar arguments were made in an Arizona case, the judge ruled that the search would be "nothing more than an interesting deep sea fishing expedition."

But in Illinois and Maryland, courts ordered the searches to proceed, despite opposition from the FBI and state officials at every turn. In July 2006, after Chicago-area defense attorneys sought a database search on behalf of a murder suspect, the FBI's Callaghan held a telephone conference with Illinois crime lab officials. The topic was "how to fight this," according to lab officials' summary of the conversation, which later became part of the court record. Callaghan suggested they tell the judge that Illinois could be disconnected from the national database system, the summary shows. Callaghan then told the lab officials "it would in fact be unlikely that IL would be disconnected," according to the summary. In an interview, Callaghan disputed he said that.

"I didn't say it was unlikely to happen," he said. "I was asked specifically, what's the likelihood here? I said, I don't know, but it takes a lot for a state to be cut off from the national database."

A week later, the judge ordered the search. Lawyers for the lab then took the matter to the Illinois Supreme Court, arguing in part that Illinois could lose its access to the federal DNA database. The high court refused to block the search.

The result: 903 pairs of profiles matching at nine or more loci in a database of about 220,000. State officials obtained a court order to prevent distribution of the results. The Times obtained them from a scientist who works closely with the FBI.


A 'unilateral decision'

A similar fight occurred in a death penalty case in Maryland during the summer and fall of 2006. The prosecutor saw a DNA match between a baseball cap dropped at the crime scene and the suspect as so definitive that he didn't plan to tell the jury about the chance of a coincidental match, records show. Seeking to cast doubt on the evidence, the defense persuaded the judge to order an "Arizona search" of the Maryland database. The state did not comply.

After the defense filed a contempt-of-court motion, Michelle Groves, the state's DNA administrator, argued in court and in an affidavit that, based on conversations with Callaghan at the FBI, she believed the request was burdensome and possibly illegal. According to Groves, Callaghan had told her that complying with the court order could lead Maryland to be disconnected from CODIS -- a result Groves' lawyer said would be "catastrophic." Groves' affidavit was edited by FBI officials and the technology contractor that designed CODIS, court records show. Before submitting the affidavit, Groves wrote the group an e-mail saying, "Let's see if this will work," the records show.

It didn't. After the judge, Steven Platt, rejected her arguments, Groves returned to court, saying the search was too risky. FBI officials had now warned her that it could corrupt the entire state database, something they would not help fix, she told the court. Platt reaffirmed his earlier order, decrying Callaghan's "unilateral" decision to block the search.

"The court will not accept the notion that the extent of a person's due process rights hinges solely on whether some employee of the FBI chooses to authorize the use of the [database] software," Platt wrote.

The search went ahead in January 2007. The system did not go down, nor was Maryland expelled from the national database system. In a database of fewer than 30,000 profiles, 32 pairs matched at nine or more loci. Three of those pairs were "perfect" matches, identical at 13 out of 13 loci. Experts say they most likely are duplicates or belong to identical twins or brothers. It's also possible that one of the matches is between unrelated people -- defying odds as remote as 1 in 1 quadrillion.

Maryland officials never did the research to find out.

Matching profiles

As databases grow, so do the chances of finding a coincidental match. Three states have searched their DNA databases for pairs of profiles that have nine or more genetic markers in common. The more profiles in the database, the more matches were found.

Maryland: 33 matches in a database of 20,000 profiles
Arizona: 144 matches in a database of 65,000 profiles
Illinois: 903 matches in a database of 230,000 profiles


California: State database has more than 1 million profiles. Several search requests have been denied.

FBI: The national DNA database, maintained by the FBI, has almost 6 million DNA profiles. It has never been searched for such coincidental matches.


Quote
Birthday paradox

Experts use an analogy called the birthday paradox to explain that the way you search for a DNA profile can dramatically affect your chances of finding a match. In some circumstances, matches are far more likely than many people think.

Imagine you're at a party with 99 other guests. If you randomly pull aside one of them, the odds he or she will share your day and month of birth are 1 in 365.

But the probability that anyone at the party shares your birthday is far higher: about 1 in 4. When you compare your birthday with 99 other people's, each comparison makes a match more likely. (The math: Multiply the odds of 1/365 by 99, the number of comparisons, to get the approximate probability.)

For the same reason, the odds that anyone at the party shares a birthday with anyone else are higher still. In fact, it's almost a certainty. As everyone looked for a match with everyone else, they made 4,950 comparisons. (The math: Multiply 100 people by the 99 other guests they compare themselves with, then divide by two because people who compare with each other count as a single comparison.)

How many people need to be at the party for it to be likely that two guests share a birthday?

The answer may surprise you: just 23.
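
These figures are easy to verify directly. The short sketch below (a reader's check, not part of the article) computes the chance that someone at a 100-person party shares your birthday, the chance that any two guests share one, and the smallest party at which a shared birthday becomes more likely than not.

Code: (Python)
# Quick check of the birthday-paradox figures quoted above,
# ignoring leap years and assuming birthdays are uniformly distributed.

def prob_someone_shares_your_birthday(others: int) -> float:
    # 1 minus the chance that every one of the other guests misses your birthday.
    return 1 - (364 / 365) ** others

def prob_any_shared_birthday(n: int) -> float:
    # 1 minus the chance that all n birthdays are distinct.
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (365 - k) / 365
    return 1 - p_distinct

print(prob_someone_shares_your_birthday(99))  # ~0.24, i.e. about 1 in 4
print(prob_any_shared_birthday(100))          # ~0.9999997, near certainty
print(min(n for n in range(2, 366) if prob_any_shared_birthday(n) > 0.5))  # 23

Running it confirms the sidebar: about 1 in 4, near certainty, and 23.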
 

Are you in earnest resolved never to barter your liberty for the lordly servitude of a court, but to live free, fearless, and independent? Then never enter the place from whence so few have been able to return; never come within the circle of ambition; nor ever bring yourself into comparison with those masters of the earth who have already engrossed the attention of half mankind before you.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #26 on: Jun 28, 2013, 07:56:20 pm »
 

Brocke


Example 27

There’s no such thing as a fish: There is no direct scientific equivalent for our casual word "fish", and what most people call fish are a broad and diverse group. The word "fish" is a vague word that could mean many things; it doesn't really have a place in biological literature. We use the word ‘fish’ to refer to a number of different branches of the animal kingdom rather than the single branch that was originally intended to be known as fish, so in a way the word has lost its meaning.



We often group species together in a superficial way and it is only by looking at the genome of a species that we can begin to fully understand evolutionary lineage. Placing different plants or animals under an umbrella term can make life easier but it makes no sense on a genetic level to say that all things that live in the sea are fish, just as it makes no sense to say that all things that can fly are birds.

A lifetime study of sea creatures led the esteemed biologist and paleontologist Stephen Jay Gould to conclude that there is, in fact, no such thing as a fish. He explained that the term had no biological meaning and was an over-simplification that grouped aquatic creatures together.

As evolutionary biologist Richard Dawkins explains in his book The Greatest Show on Earth, “trout and tuna are closer cousins to humans than they are to sharks, but we call them all ‘fish’.”


QI XL S08E03
http://www.youtube.com/watch?v=gTHlrpMy7ps&feature=player_detailpage&list=PL91B7DE2F81280A1B#t=1263s

There’s no such thing as a fish
http://www.elements-science.co.uk/2012/05/turtle-origins-uncovered/

New Scientist 2 Jan 1986
http://books.google.com.au/books?id=HHUe3pChrPQC&lpg=PA44&ots=k24ACPpKz2&dq=%22no%20such%20thing%20as%20a%20fish%22&pg=PA44#v=onepage&q=%22no%20such%20thing%20as%20a%20fish%22&f=false
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #27 on: Jun 28, 2013, 07:59:58 pm »
 

Brocke

Example 28

Chastity belts were used in the Middle Ages to prevent the crusading knights' ladies from indulging in sexual indiscretions.



The idea of a crusader clapping his wife in a chastity belt and galloping off to war with the key round his neck is a nineteenth-century fantasy designed to titillate readers. There is very little evidence for the use of chastity belts in the Middle Ages at all. The first known drawing of one occurs in the fifteenth century.

Konrad Kyeser’s Bellifortis was a book on contemporary military equipment written long after the crusades had finished. It includes an illustration of the “hard iron breeches” worn by Florentine women. In the diagram, the key is clearly visible—which suggests that it was the lady and not the knight who controlled access to the device, to protect herself against the unwanted attentions of Florentine bucks. In museum collections, most “medieval” chastity belts have now been shown to be of dubious authenticity and removed from display.

As with “medieval” torture equipment, it appears that most of it was manufactured in Germany in the nineteenth century to satisfy the curiosity of “specialist” collectors. The nineteenth century also witnessed an upturn in sales of new chastity belts—but these were not for women. Victorian medical theory was of the opinion that masturbation was harmful to health. Boys who could not be trusted to keep their hands to themselves were forced to wear these improving steel underpants. But the real boom in sales has come in the last fifty years, as adult shops take advantage of the thriving bondage market.

There are more chastity belts around today than there ever were in the Middle Ages. Paradoxically, they exist to stimulate sex, not to prevent it.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #28 on: Jun 28, 2013, 08:12:37 pm »
 

Brocke

Example 29

Seatbelts in automobiles save lives.



Since 1963 the federal government has spent billions of dollars to persuade, and force, the American people to wear seatbelts in automobiles. It has done this without any research, without any basis in fact, without any evidence that wearing a seatbelt improves a person’s chance of survival in an automobile accident. Indeed, research has shown that the opposite is true. [1]

[...]

After motorists were first required to use seatbelts, reports began to come in from emergency rooms that people were being killed by seatbelts. Instead of heeding these reports and repealing the seatbelt laws, congress put forward the theory that it was merely a mistake in seatbelt design, and ordered the addition of shoulder belts to seatbelts. This resulted in an increase in seatbelt fatalities, as motorists were now being killed by their shoulder belts as well as by their lap belts. [1]

[...]

When it became clear that seatbelts were not effective in preventing fatalities in head-on collisions the National Highway Traffic Safety Administration (NHTSA), in line with its continuing mandate to promote seatbelts, put forward the theory that they would prevent people from being killed in roll-overs. Quite apart from the fact that most people couldn’t roll their cars over if they tried, it turned out that most people who were killed in roll-overs were killed by being crushed when the roof caved in. The best chance of survival in such a case is to duck down, jump clear or be thrown clear, all of which are prevented by a seatbelt. The effect of seatbelts in roll-overs was thus to increase, not decrease, the number of fatalities.

The first seatbelt law was passed in the United States in 1963. This merely required that new cars made after 1964 be equipped with seat belts. There was no requirement that people actually use them. When it was first suggested to Henry Ford II that he put seatbelts in new Ford cars, his response was, “That’s the craziest thing I ever heard”. During the hearings held both before and after the passage of these laws, experts from the automobile industry repeatedly warned the members of congress that putting seatbelts in cars was not a good idea. The congress chose to ignore these warnings.

The case for seatbelts in automobiles was based on five false assumptions, something congress could easily have discovered before passing this legislation if they had bothered to ask the experts or, indeed, if they had merely listened to the experts, for the experts did try to tell them. They not only did not ask, they turned a deaf ear when they were told. As a result, thousands have died. [1]

The five false assumptions were these:

1. Most people who are killed in automobile accidents are killed in head-on collisions. In fact, according to the government’s own data, fewer than two percent of all collisions are head-on collisions and fewer than 14% of all fatal collisions are head-on collisions.

2. People are killed in head-on collisions by being thrown through the windshield. In fact, according to the latest available government data, of the 36,281 vehicle occupants who were killed in 2001 (the last year for which the government listed head-on collisions as a separate category) only 145 were “thrown through the windshield”.

3. Vehicle occupants would be saved if they were prevented from being thrown through the windshield by wearing a seatbelt. In fact, if the force on the occupant is sufficiently great to throw him through the windshield, the injury inflicted on the wearer by the seatbelt itself would be enough to kill him.

4. The passenger compartment is never safe in fatal collisions. In fact, the overwhelming majority of motorists who are killed in fatal collisions are killed by being crushed to death when the passenger compartment is caved in. The seatbelt acts like an anchor, holding the occupant in place while he is being crushed to death.

5. The seatbelt itself will not injure the wearer. In fact, in a head-on collision as low as 30 miles per hour with one foot of crush, the seatbelt will exert a force on the wearer of 30 times his body weight, i.e., enough to kill him. The fact that the seatbelt itself might injure the wearer never occurred to them. The seatbelt proponents had never heard of Newton's second law of motion!
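
For what it's worth, the "30 times his body weight" figure in assumption 5 does follow from basic kinematics if you assume constant deceleration over one foot of crush. The sketch below only checks that arithmetic; it says nothing about whether such a belt load is survivable, which is the claim being argued.

Code: (Python)
# Back-of-envelope check of assumption 5: average deceleration when a
# vehicle stops from 30 mph over one foot of crush, assuming constant
# deceleration (a = v^2 / 2d). The restraining force is roughly
# (a / g) times the occupant's body weight.

MPH_TO_MS = 0.44704
FOOT_TO_M = 0.3048
g = 9.81                    # m/s^2

v = 30 * MPH_TO_MS          # ~13.4 m/s
d = 1 * FOOT_TO_M           # ~0.30 m
a = v ** 2 / (2 * d)        # ~295 m/s^2

print(f"Average deceleration: {a:.0f} m/s^2, about {a / g:.0f} g")
# -> roughly 30 g, so under these assumptions the belt briefly carries
#    a load on the order of 30 times the wearer's body weight.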

The key question, what is the effect of seatbelts on people in other types of accidents, was not considered. [1]

Seat-belt laws have also failed to reduce highway fatalities in the numbers promised by supporters to get such laws passed. According to the National Highway Traffic Safety Administration, there were 51,093 highway fatalities in 1979. Five years later, 1984, the year before seat-belt laws began to pass, there were 44,257 fatalities. That is a net decrease of 6,836 deaths in five years, which represents a 13.4 percent decline with no seat-belt laws and only voluntary seat-belt use. In 1999, there were 41,611 fatalities. That is a net decrease of 2,646 deaths, a 6 percent decrease over 15 years of rigid seat-belt law enforcement, with some states claiming 80 percent seat-belt use. If the passage of seat-belt laws did anything, it slowed the downward trend in highway fatalities started years before the passage of such laws. [2]

Fatalities per year comparisons. Figure 15-1 shows the change in the simplest measure of safety performance, total traffic deaths per year. While fatalities in the 23 year period declined in the US by 16.2%, declines of 46.0%, 49.9%, and 51.1% occurred in Britain, Canada, and Australia (Table 15-1). In the prior 1960-1978 period the comparison countries did not systematically outperform the US. On the contrary, fatalities in Canada and Australia increased by 65% and 50% (compared to a 38% increase in the US), but in GB decreased by 2%.

The number of traffic deaths that would have occurred in the US in 2002 if US fatalities had declined by the same percents as in the comparison countries from 1979-2002 are shown in Table 15-2. If the US total had declined by 46.0%, as it did in Great Britain, then US fatalities in 2002 would have been 27,598 instead of the 42,815 that occurred. (All derivations are based on calculations including more decimal places than shown in tables). By matching the British decline, 15,217 fewer Americans would have been killed in 2002. The corresponding fatality reductions for matching Canadian and Australian performance are 17,229 and 17,837. [3]
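
The percentages and counterfactuals in the last two excerpts can be re-derived from the fatality totals quoted in the text. The sketch below simply redoes that arithmetic (small differences come from rounding of the published percentages); it does not settle what caused the trends.

Code: (Python)
# Re-deriving the figures quoted above from the fatality totals in the text.

us = {1979: 51_093, 1984: 44_257, 1999: 41_611, 2002: 42_815}

# Percentage declines cited from the Freeman article [2]:
print(f"1979-1984: {(us[1979] - us[1984]) / us[1979]:.1%} decline")   # ~13.4%
print(f"1984-1999: {(us[1984] - us[1999]) / us[1984]:.1%} decline")   # ~6.0%

# Counterfactuals cited from Evans [3]: apply each country's 1979-2002
# percentage decline to the 1979 US total and compare with actual 2002 deaths.
declines = {"Great Britain": 0.460, "Canada": 0.499, "Australia": 0.511}
for country, decline in declines.items():
    counterfactual = us[1979] * (1 - decline)
    print(f"Matching {country}: ~{counterfactual:,.0f} deaths in 2002, "
          f"about {us[2002] - counterfactual:,.0f} fewer than occurred")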



ref.

1. SEATBELTS KILL - THE TRUE STORY OF THE SEATBELT SCAM
http://www.fiberpipe.net/~tiktin/Documents/seatbeltskill.htm

2. The Fraud of Seat-Belt Laws
http://fee.org/the_freeman/detail/the-fraud-of-seat-belt-laws/#ixzz2CPyvt7WV

3. Traffic Safety by Leonard Evans. Chapter 15 "The dramatic failure of US safety policy"
http://scienceservingsociety.com/ts/text/ch15.htm
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #29 on: Jun 28, 2013, 08:18:09 pm »
 

Brocke

Example 30

DDT (dichlorodiphenyltrichloroethane) is a poison that causes cancer and is devastating to wildlife.

Michael Crichton on DDT
http://www.youtube.com/watch?v=aSYla0y9Wcs



        "In fact, DDT prevents cancer. "DDT in the diet has repeatedly been shown to enhance the production of hepatic enzymes in mammals and birds.
        Those enzymes inhibit tumors and cancers in humans as well as wildlife."

        "The search for an effective substitute for DDT continues to fail 30 years after the Ruckelshaus ban.
        The search for a treatment for malaria continues to fail; the mutations of the malaria parasite soon make a drug ineffective. The search for a malaria vaccine continues to fail."


The chemical compound that has saved more human lives than any other in history, DDT, was banned by order of one man, the head of the U.S. Environmental Protection Agency (EPA). Public pressure was generated by one popular book and sustained by faulty or fraudulent research. Widely believed claims of carcinogenicity, toxicity to birds, anti-androgenic properties, and prolonged environmental persistence are false or grossly exaggerated. The worldwide effect of the U.S. ban has been millions of preventable deaths.

In World War I, prior to the discovery of the insecticidal potential of DDT, typhus killed more servicemen than bullets. In World War II, typhus was no problem. The world has marveled at the effectiveness of DDT in fighting malaria, yellow fever, dengue, sleeping sickness, plague, encephalitis, West Nile Virus, and other diseases transmitted by mosquitoes, fleas, and lice.

Today, the greatest killer and disabler is malaria, which kills a person every 30 seconds. By the 1960s, DDT had brought malaria near to extinction. "To only a few chemicals does man owe as great a debt as to DDT. In little more than two decades, DDT has prevented 500 million human deaths, due to malaria, that otherwise would have been inevitable," said the National Academy of Sciences.


        Unable to find harm to human health, DDT opponents turned to bird health, alleging a decline of bald eagles and other birds of prey, which they associated with heavy DDT usage.
        Rachel Carson led the accusation. It has been repeated so often and so passionately that the public is still convinced of it.



But the handwriting was on the wall when William Ruckelshaus, administrator of the Environmental Protection Agency, in an address to the Audubon Society in Milwaukee in 1971, clearly stated his position:

"As you know, many mass uses of DDT have already been prohibited, including all uses around the home. Certainly we'll all feel better when the persistent compounds can be phased out in favor of biological controls. But awaiting this millennium does not permit the luxury of dodging the harsh decisions of today.

Rachel Carson began the countrywide assault on DDT with her 1962 book, Silent Spring. Carson made errors about DDT and synthetic pesticides, some designed to scare. "For the first time in the history of the world, every human being is now subjected to contact with dangerous chemicals, from the moment of conception to death," she intoned.

"This is nonsense," commented pesticide specialists Bruce N. Ames and Thomas H. Jukes of the University of California at Berkeley. (Ames is a professor of biochemistry and molecular biology, world renowned. Jukes, who died a few years ago, was a professor of biophysics and a leader in the defense of DDT.) "Every chemical is dangerous if the concentration is too high. Moreover, 99.9 percent of the chemicals humans ingest are natural... produced by plants to kill off predators," Ames and Jukes wrote in Reason in 1993.

Carson, not very scrupulous, implied that the renowned Albert Schweitzer agreed with her on DDT by dedicating Silent Spring "to Dr. Albert Schweitzer, who said 'Man has lost the capacity to foresee and forestall. He will end by destroying the earth.'" Professor Edwards doubted the implication. He got a copy of Schweitzer's autobiography. Dr. Schweitzer was referring to atomic warfare. Professor Edwards found on page 262, "How much labor and waste of time these wicked insects do cause, but a ray of hope, in the use of DDT, is now held out to us."

But Miss Carson's skillful writing was enough to direct a new-born environmental industry looking for a hot issue into a feverish campaign against DDT. "Rachel Carson set the style for environmentalism. Exaggeration and omission of pertinent contradictory evidence are acceptable for the holy cause," wrote Professors Ames and Jukes.



ref.

DDT, Fraud, and Tragedy
http://spectator.org/archives/2005/02/25/ddt-fraud-and-tragedy

DDT: A Case Study in Scientific Fraud" by the late J. Gordon Edwards, Professor Emeritus of Entomology at San Jose State University in San Jose, California.
http://www.jpands.org/vol9no3/edwards.pdf
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #30 on: Jun 28, 2013, 08:21:28 pm »
 

Brocke

Example 31

Edison invented the light bulb



In 1840, the British astronomer and chemist Warren de la Rue enclosed a platinum coil in a vacuum tube and passed an electric current through it, thus creating the world’s first light bulb – a full 40 years before Edison was issued a patent for creating it.

Actually, historians list up to 22 inventors of the incandescent lamp before Thomas Edison, starting with Sir Humphry Davy in the early 19th Century.

But in 1878, Edison challenged himself and his workers to produce a commercially viable and longer-lasting light bulb, based on the work of inventors before him. In October 1879, having created an extremely high vacuum inside a bulb and used a carbon filament, he filed a US patent for the first practical high-resistance lamp, capable of burning for hundreds of hours.

So while he didn’t actually invent the lightbulb, he did produce the first version that was practical for everyday use.



http://en.wikipedia.org/wiki/Incandescent_light_bulb



some added truth...

The Phoebus cartel

VIDEO: The Lightbulb Conspiracy
http://documentaryheaven.com/the-lightbulb-conspiracy/

In the early 1900s, the goal was to make the light bulb last as long as possible. Edison's lamp lasted 1,500 hours, and in the 1920s manufacturers advertised lamps sporting a 2,500-hour life. Then the leading lamp manufacturers came up with the idea that it might be more profitable if the bulbs were made less durable.

In 1924, the Phoebus cartel was created in order to control global lamp production, and it tied in manufacturers all over the world, dividing the various continents between them. In the documentary, historian Helmut High shows the original cartel document, which states: “The average life of lamps may not be guaranteed, advertised or published as more than 1 000 hours.” The cartel pressured its members to develop a more fragile incandescent bulb that would remain within the established 1,000-hour rule. Osram tested bulb life, and all manufacturers that did not keep to the lower standard were heavily fined. Bulb life was thereby reduced to the required 1,000 hours.

The film claims that there are patents on incandescent light bulbs with 100,000-hour lifetimes, but they never went into production – except Adolphe Chaillet's bulb at the Livermore Fire Department in California, which has burned continuously since 1901. In 1981, the East German company Narva created a long-life lamp and showed it at an international light fair. Nobody was interested. (It later became accepted as a special ‘long-life’ lamp but was never a commercial hit.)

Wikipedia states that the Phoebus cartel included Osram, Philips, Tungsram, Compagnie des Lampes, Associated Electrical Industries, ELIN, International General Electric, and the GE Overseas Group. “They owned shares in the Swiss corporation proportional to their lamp sales.”

    “The Phoebus Cartel divided the world’s lamp markets into three categories:

       1. home territories, the home country of individual manufacturers
       2. British overseas territories, under control of Associated Electrical Industries, Osram, Philips, and Tungsram
       3. common territory, the rest of the world

In 1921 a precursor organisation was founded by Osram, the Internationale Glühlampen Preisvereinigung. When Philips and other manufacturers were entering the American market, General Electric reacted by setting up the International General Electric Company in Paris. Both organisations were involved in trading patents and adjusting market penetration. Increasing international competition led to negotiations between all major companies to control and restrict their respective activities in order not to interfere in each other’s spheres.”

According to the documentary, the cartel officially never existed (even though their memorandum remains in archives). Their strategy has been to rename all the time, but still exists in one form or another. The film mentions The International Energy cartel, but that seems to be more about controlling world energy production rather than light bulbs specifically.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #31 on: Jun 28, 2013, 09:56:43 pm »
 

Brocke

Example 32

You are living in the past



http://www.youtube.com/watch?v=BTOODPf-iuc

We persistently have the illusion that it is the present. Just like we have the illusion of a single vision instead of 32, or of one body instead of 7. We are living approximations, and every single thing we do or sense is an approximation. It's a wonderful illusion though.


LINKS:

Watch David Eagleman discuss time: http://www.youtube.com/watch?v=MkANniH8XZE

Flash Lag Effect demonstration: http://www.michaelbach.de/ot/mot_flashlag1/index.html

Previous Vsauce videos about time and perception:

Why does time feel faster as we age? http://www.youtube.com/watch?v=6LyCC6jjcx8

Video and the FRAME RATE of the eye: http://www.youtube.com/watch?v=buSaywCF6E8

Stopped Clock Illusion: http://www.youtube.com/watch?v=nNBTLbw1_2Q

 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #32 on: Jun 28, 2013, 10:04:16 pm »
 

Brocke

Example 33 (yes, yes, I know...)

The City of London is not the city named London.




The Great City of London is known for its historical landmarks, modern skyscrapers, ancient markets and famous bridges. It's arguably the financial capital of the world and home to over eleven thousand people.


Wait, what?  Eleven... thousand?


That's right: but the City of London is a different place from London -- though London is also known for its historical landmarks, modern skyscrapers, ancient markets, famous bridges and is home to the government, royal family and seven million people.


But if you look at a map of London crafted by a careful cartographer, that map will have a one-square-mile hole near the middle -- it's here that the City of London lives inside the city named London.
Despite these confusingly close names, the two Londons have separate city halls and elect separate mayors, who collect separate taxes to fund separate police who enforce separate laws.


The Mayor of the City of London has a fancy title 'The Right Honourable the Lord Mayor of London' to match his fancy outfit.   He also gets to ride in a golden carriage and work in a Guildhall while the mayor of London has to wear a suit, ride a bike and work in an office building.


The City of London also has its own flag and its own crest which is awesome and makes London's lack of either twice as sad.


To top it off, the City of London gets to act more like one of the countries in the UK than just an oddly located city -- for, uniquely, the corporation that runs the City of London is older than the United Kingdom by several hundred years.


So how did the UK end up with two Londons, one inside of the other?  Because: Romans.


2,000 years ago they came to Great Britain, killed a bunch of druids, and founded a trading post on the River Thames and named it Londinium. Being Romans, they got to work doing what Romans do: enforcing laws, increasing trade, building temples, public baths, roads, bridges and a wall to defend their work.


And it's this wall which is why the current City of London exists -- for though the Romans came and the Romans went and kingdoms rose and kingdoms fell, the wall endured protecting the city within.  And The City, governing itself and trading with the world, grew rich.


A thousand years after the Romans (yet still a thousand years ago), when William the Conqueror came to Great Britain to conquer everything and begin modern British history, he found the City of London, with its sturdy walls, more challenging to defeat than farmers on open fields.


So he agreed to recognize the rights and privileges City of Londoners were used to in return for them recognizing him as the new King.


Though after the negotiation, William quickly built towers around the City of London which were just as much about protecting William from the locals within as defending against the Vikings from without.


This started a thousand-year long tradition whereby Monarchs always reconfirmed that 'yes' the City of London is a special, unique place best left to its own business, while simultaneously distrusting it.


Many a monarch thought the City of London was too powerful and rich. And one even built a new capital city nearby, named Westminster, to compete with the City of London and hopefully suck power and wealth away from it. This was the start of the second London.


As the centuries passed, Westminster grew and merged with nearby towns eventually surrounding the walled-in, and still separate City of London.  But, people began to call the whole urban collection 'London' and the name became official when Parliament joined towns together under a single municipal government with a mayor.


But, the mayor of London still doesn't have power over the tiny City of London which has rules and traditions like nowhere else in the country and possibly the world.


For example, the ruling monarch doesn't just enter the City of London on a whim, but instead asks for permission from the Lord Mayor at a ceremony. While it's not required by law, the ceremony is unusual, to say the least.


The City of London also has a representative in Parliament, The Remembrancer, whose job it is to protect the City's special rights.


Because of this, laws passed by Parliament sometimes don't apply to the City of London: most notably voting reforms, which we'll discuss next time. But if you're curious, unlike anywhere else in the UK, elections in the City of London involve medieval guilds and modern companies.


Finally, the City of London also owns and operates land and buildings far outside its border, making it quite wealthy.


Once you start looking for The City's crest you'll find it in lots of places, but most notably on Tower Bridge which, while being in London, is operated by the City of London.
These crests everywhere, combined with the City of London's age, wealth and quasi-independent status, make it an irresistible temptation for conspiracy nuts. Add in the oldest Masonic temple and it's not long before the crazy part of the Internet is yelling about secret societies controlling the world via the finance industry from inside the City-state of London. (And don't forget the reptilian alien Queen who's really behind it all.)


But conspiracy theories aside, the City of London is not an independent nation like the Vatican is, no matter how much you might read that on the Internet; rather, it's a unique place in the United Kingdom with a long and complicated history.


The wall that began all this 2,000 years ago is now mostly gone -- so the border between London and its secret inner city isn't so obvious.   Though, next time you're in London, if you come across a small dragon on the street, he still guards the entrance to the city in a city in a country in a country.


The (Secret) City of London, Part 1: History

http://www.youtube.com/watch?v=LrObZ_HZZUc

The (Secret) City of London, Part 2: Government
http://www.youtube.com/watch?v=z1ROpIKZe-c
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #33 on: Jun 28, 2013, 10:20:46 pm »
 

Brocke

Example 34

The Left-Brain Right-Brain Myth



Everyone knows the popular myths about the two brain hemispheres: The right brain is artistic, musical, spatial, intuitive, and holistic; the left brain is linear, rational, analytical, and linguistic. There is some truth in these labels. But, not surprisingly, they are mostly oversimplifications of tendencies, not fixed rules.

When asked to address some of the common conceptions about hemispheric differences, split-brain expert Michael Corballis, author of The Lopsided Ape, gave assessments very much along these lines.

On the subject of creativity and language-two skills often polarized as examples of right and left brain thinking-Corballis said, “I don’t see any good evidence that the right hemisphere is more creative than the left. Language itself is highly creative-every sentence you construct is a new creation-and one could make a case for supposing that the left hemisphere is really the creative one.” He goes on, “But I think artistic creativity is likely to invoke more right-hemisphere capacities, simply because of the right- hemisphere bias for spatial skills. And there are aspects of language, such as prosody, and perhaps pragmatic aspects such as an understanding of metaphor or sarcasm, that may be more right than left hemispheric. So it’s always a question of balance.”

“Quite simply,” writes Michael Gazzaniga, a former student of split-brain pioneer Roger Sperry, “all brains are not organized the same way.” Like with everything else human, genes collide with environment and the result is not a predictable thing.

In short, our reduction of the sides of the brain to the seats of this or that skill or quality misses the point entirely. “On the whole,” said Corballis, “I think it would be better for educationalists and therapists to forget about the hemispheres and concentrate on the skills themselves. The hemispheres are convenient pegs on which to hang our prejudices.”


Ref.

Why the Left-Brain Right-Brain Myth Will Probably Never Die
The myth has become a powerful metaphor, but it's one we should challenge
Published on June 27, 2012 by Christian Jarrett, Ph.D in Brain Myths
http://www.psychologytoday.com/blog/brain-myths/201206/why-the-left-brain-right-brain-myth-will-probably-never-die


Psychology for Designers - Left Brain / Right Brain Myth
http://psychologyfordesigners.com/post/38377562028/left-brain-right-brain-myth


“Left Brain” “Right Brain”: The Mind in Two
Gerald Gabriel | July 27, 2008
http://brainconnection.positscience.com/left-brain-right-brain-the-mind-in-two/

 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #34 on: Jun 28, 2013, 10:35:40 pm »
 

Brocke

Example 35

Placebo Buttons



http://www.youtube.com/watch?v=BBofF7Bwrvo

Our lives are filled with lying buttons! We're talking those pesky crosswalk buttons, close door elevator buttons and more! Why are they around, and why do we always hit them expecting magic? Anthony explores this hidden world of lies and deceit!



The Door Close button is there mostly to give passengers the illusion of control. In elevators built since the early '90s, the button is only enabled in emergency situations, with a key held by an authority.


Quote
According to a 2008 article in the New Yorker, close buttons don’t close the elevator doors in many elevators built in the United States since the 1990s. In some elevators the button is there for workers and emergency personnel to use, and it only works with a key. The key-only settings isn’t always active though, as the blog Design with Intent asserts. Each elevator is different. In some, the emergency function requires a long-press of several seconds longer than the average user attempts.
http://youarenotsosmart.com/2010/02/10/placebo-buttons/

Quote
Non-functioning mechanisms like this that motivate you to fool yourself are called placebo buttons, and they’re everywhere.

Computers and timers now control the lights at many intersections, but at one time little buttons at crosswalks allowed people to trigger the signal change. Those buttons are mostly all disabled now, but the task of replacing or removing all of them was so great most cities just left them up. You still press them though, because the light eventually changes.

In an investigation by ABC News in 2010, only one functioning crosswalk button could be found in Austin, Texas; Gainesville, Fla.; and Syracuse, N.Y.
http://youarenotsosmart.com/2010/02/10/placebo-buttons/
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #35 on: Nov 30, 2013, 03:32:49 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!

Example 36

Laws vs Rules vs Theories

The Variability of Fundamental Constants


 Do physical constants fluctuate?

As the name implies, the so-called physical constants are supposed to be changeless. They are believed to reflect an underlying constancy of nature. In this chapter I discuss how the values of the fundamental physical constants have in fact changed over the last few decades, and suggest how the nature of these changes can be investigated further.


There are many constants listed in handbooks of physics and chemistry, such as melting points and boiling points of thousands of chemicals, going on for hundreds of pages: for instance the boiling point of ethyl alcohol is 78.5°C at standard temperature and pressure; its freezing point is -117.3°C. But some constants are more fundamental than others. The following list gives those most generally regarded as truly fundamental.



The Fundamental Constants

Fundamental quantity                 Symbol
Velocity of light                    c
Elementary charge                    e
Mass of the electron                 m_e
Mass of the proton                   m_p
Avogadro constant                    N_A
Planck's constant                    h
Universal gravitational constant     G
Boltzmann's constant                 k



All these constants are expressed in terms of units; for example, the velocity of light is expressed in terms of meters per second. If the units change, so will the constants. And units are [arbitrary], dependent on definitions that may change from time to time: the meter, for instance, was originally defined in 1790 by a decree of the French National Assembly as one ten-millionth of the quadrant of the earth's meridian passing through Paris. The entire metric system was based upon the meter and imposed by law. But the original measurements of the earth's circumference were found to be in error. The meter was then defined, in 1799, in terms of a standard bar kept in France under official supervision. In 1960 the meter was redefined in terms of the wavelength of light emitted by krypton atoms; and in 1983 it was redefined again in terms of the speed of light itself, as the length of the path traveled by light in 1/299,792,458 of a second.

 
As well as any changes due to changing units, the official values of the fundamental constants vary from time to time as new measurements are made. They are continually adjusted by experts and international commissions. Old values are replaced by new ones, based on the latest 'best values' obtained in laboratories around the world. Below, I consider four examples: the gravitational constant (G); the speed of light; Planck's constant; and also the fine-structure constant α, which is derived from the charge on the electron, the velocity of light, and Planck's constant.

 
The 'best' values are already the result of considerable selection. First, experimenters tend to reject unexpected data on the grounds that they must be errors. Second, after the most deviant measurements have been weeded out, variations within a given laboratory are smoothed out by averaging the values obtained at different times, and the final value is then subjected to a series of somewhat arbitrary corrections. Finally, the results from different laboratories around the world are selected, adjusted, and averaged to arrive at the latest official value.
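
The combination-and-adjustment procedure described above is, in its simplest statistical form, an inverse-variance weighted mean: each laboratory's value is weighted by the reciprocal of its squared quoted uncertainty. The sketch below (Python, with purely illustrative numbers rather than real review data) shows how such a 'best value' and its uncertainty are typically computed.

Code:
# Minimal sketch of an inverse-variance weighted mean, the simplest form of the
# kind of combination and adjustment described above.
# The numbers are purely illustrative, not real laboratory results.

values = [6.674, 6.671, 6.676, 6.669]    # hypothetical lab values of G (x 10^-11 N m^2 kg^-2)
uncerts = [0.003, 0.002, 0.004, 0.003]   # quoted one-sigma uncertainties

weights = [1.0 / u**2 for u in uncerts]  # weight = 1 / sigma^2
wsum = sum(weights)

best = sum(w * v for w, v in zip(weights, values)) / wsum  # weighted mean
sigma = (1.0 / wsum) ** 0.5                                # uncertainty of the mean

print(f"combined 'best value': {best:.4f} +/- {sigma:.4f}")

Note that, by construction, this procedure treats every deviation from the weighted mean as noise; any genuine common fluctuation between measurement campaigns is averaged away, which is exactly the point being made in the text.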
 
Faith in eternal truths
 In practice, then, the values of the constants change. But in theory they are supposed to be changeless. The conflict between theory and empirical reality is usually brushed aside without discussion, because all variations are assumed to be due to experimental errors, and the latest values are assumed to be the best.

 
But what if the constants really change? What if the underlying nature of nature changes? Before this subject can even be discussed, it is necessary to think about one of the most fundamental assumptions of science as we know it: faith in the uniformity of nature. For the committed believer, these questions are nonsensical. Constants must be constant.

 
Most constants have been measured only in this small region of the universe for a few decades, and the actual measurements have varied erratically. The idea that all constants are the same everywhere and always is not an extrapolation from the data. If it were an extrapolation it would be outrageous. The values of the constants as actually measured on earth have changed considerably over the last fifty years. To assume they had not changed for fifteen billion years anywhere in the universe goes far beyond the meager evidence. The fact that this assumption is so little questioned, so readily taken for granted, shows the strength of scientific faith in eternal truths.

 
According to the traditional creed of science, everything is governed by fixed laws and eternal constants. The laws of nature are the same in all times and at all places. In fact they transcend space and time. They are more like eternal Ideas--in the sense of Platonic philosophy--than evolving things. They are not made of matter, energy, fields, space, or time; they are not made of anything. In short, they are immaterial and non-physical. Like Platonic Ideas they underlie all phenomena as their hidden reason or logos, transcending space and time.

 
Of course, everyone agrees that the laws of nature as formulated by scientists change from time to time, as old theories are partially or completely superseded by new ones. For example, Newton's theory of gravitation, depending on forces acting at a distance in absolute time and space, was replaced by Einstein's theory of the gravitational field consisting of curvatures of space-time itself. But both Newton and Einstein shared the Platonic faith that underlying the changing theories of natural science there are true eternal laws, universal and immutable. And neither challenged the constancy of constants: indeed both gave great prestige to this assumption, Newton through his introduction of the universal gravitational constant, and Einstein through treating the speed of light as absolute. In modern relativity theory, c is a mathematical constant, a parameter relating the units used for time to the units used for space; its value is fixed by definition. The question as to whether the speed of light actually differs from c, although theoretically conceivable, seems of peripheral interest.

 
For the founding fathers of modern science, such as Copernicus, Kepler, Galileo, Descartes, and Newton, the laws of nature were changeless Ideas in the divine mind. God was a mathematician. The discovery of the mathematical laws of nature was a direct insight into the eternal Mind of God. Similar sentiments have been echoed by physicists ever since.

 
Until the 1960s, the universe of orthodox physics was still eternal. But evidence for the expansion of the universe has been accumulating for several decades, and the discovery of the cosmic microwave background radiation in 1965 finally triggered off a great cosmological revolution. The Big Bang theory took over. Instead of an eternal machine-like universe, gradually running down toward thermodynamic heat death, the picture was now one of a growing, developing, evolutionary cosmos. And if there was a birth of the cosmos, an initial 'singularity', as physicists put it, then once again age-old questions arise. Where and what did everything come from? Why is the universe as it is? In addition, a new question arises. If all nature evolves, why should the laws of nature not evolve as well? If laws are immanent in evolving nature, then the laws should evolve too.

 
Today these questions are usually discussed in terms of the anthropic cosmological principle, as follows: Out of the many possible universes, only one with the constants set at the values found today could have given rise to a world with life as we know it and allowed the emergence of intelligent cosmologists capable of discussing it. If the values of the constants had been different, there would have been no stars, nor atoms, nor planets, nor people. Even if the constants were only slightly different, we would not be here. For example, with just a small change in the relative strengths of the nuclear and electromagnetic forces there could be no carbon atoms, and hence no carbon-based forms of life such as ourselves. 'The Holy Grail of modern physics is to explain why these numerical constants . . . have the particular numerical values they do.'

 
Some physicists incline toward a kind of neo-Deism, with a mathematical creator-God who fine-tuned the constants in the first place, selecting from many possible universes the one in which we can evolve. Others prefer to leave God out of it. One way of avoiding the need for a mathematical mind to fix the constants of nature is to suppose that our universe arose from a froth of possible universes. The primordial bubble that gave rise to our universe was one of many. But our universe has to have the constants it does by the very fact we are here. Somehow our presence imposes a selection. There may be innumerable alien and uninhabitable universes quite unknown to us, but this is the only one we can know.

 
This kind of speculation has been carried even further by Lee Smolin, who has proposed a kind of cosmic Darwinism. Through black holes, baby universes may be budded off from pre-existing ones and take on a life of their own. Some of these might have slight mutations in the values of their constants and hence evolve differently. Only those that form stars can form black holes and hence have babies. So by a principle of cosmic fecundity, only universes like ours would reproduce, and there may be many more or less similar habitable universes. But this very speculative theory still does not explain why any universes should exist in the first place, nor what determines the laws that govern them, nor what maintains, carries, or remembers the mutant constants in any particular universe.

 
Notice that all these metaphysical speculations, extravagant though they seem, are thoroughly conventional in that they take for granted both eternal laws and constant constants, at least within a given universe. These well-established assumptions make the constancy of constants seem like an assured truth. Their changelessness is an act of faith. ...If measurements show variations in the constants, as they often do, then the variations are dismissed as experimental errors; the latest figure is the best available approximation to the 'true' value of the constant.

 
Some variations may well be due to errors, and such errors decrease as instruments and methods of measurement improve. All kinds of measurements have inherent limitations on their accuracy. But not all the variations in the measured values of the constants need necessarily be due to error, or to the limitations of the apparatus used. Some may be real. In an evolving universe, it is conceivable that the constants evolve along with nature. They might even vary cyclically, if not chaotically.
 
Theories of changing constants
 Several physicists, among them Arthur Eddington and Paul Dirac, have speculated that at least some of the 'fundamental constants' may change with time. In particular, Dirac proposed that the universal gravitational constant, G, may be decreasing with time: the gravitational force weakening as the universe expands. But those who make such speculations are usually quick to avow that they are not challenging the idea of eternal laws; they are merely proposing that eternal laws govern the variation of the constants.

 
The proposal that the laws themselves evolve is more radical. The philosopher Alfred North Whitehead pointed out that if we drop the old idea of Platonic laws imposed on nature, and think instead of laws being immanent in nature, then they must evolve along with nature:

 
Since the laws of nature depend on the individual characters of the things constituting nature, as the things change, then consequently the laws will change. Thus the modern evolutionary view of the physical universe should conceive of the laws of nature as evolving concurrently with the things constituting the environment. Thus the conception of the Universe as evolving subject to fixed eternal laws should be abandoned.

 
I prefer to drop the metaphor of 'law' altogether, with its outmoded image of God as a kind of law-giving emperor, as well as an omnipotent and universal law-enforcement agency. Instead, I have suggested that the regularities of nature may be more like habits. According to the hypothesis of morphic resonance, a kind of cumulative memory is inherent in nature. Rather than being governed by an eternal mathematical mind, nature is shaped by habits, subject to natural selection. And some habits are more fundamental than others; for example, the habits of hydrogen atoms are very ancient and widespread, found throughout the universe, while the habits of hyenas are not. Gravitational and electromagnetic fields, atoms, galaxies and stars are governed by archaic habits, dating back to the earliest periods in the history of the universe. From this point of view the 'fundamental constants' are quantitative aspects of deep-seated habits. They may have changed at first, but as they became increasingly fixed through repetition, the constants may have settled down to more or less stable values. In this respect the habit hypothesis agrees with the conventional assumption of constancy, though for very different reasons.

 
Even if speculations about the evolution of constants are set aside, there are at least two more reasons why constants may vary. First, they may depend on the astronomical environment, changing as the solar system moves within the galaxy, or as the galaxy moves away from other galaxies. And second, the constants may oscillate or fluctuate. They may even fluctuate in a seemingly chaotic manner. Modern chaos theory has enabled us to recognize that chaotic behavior, as opposed to old-style determinism, is normal in most realms of nature. So far the 'constants' have survived unchallenged from an earlier era of physics: the vestiges of a lingering Platonism. But what if they, too, vary chaotically?
 
The variability of the universal gravitational constant
 In spite of the central importance of the universal gravitational constant, it is the least well defined of all the fundamental constants. Attempts to pin it down to many places of decimals have failed; the measurements are just too variable. The editor of the scientific journal Nature has described as 'a blot on the face of physics' the fact that G still remains uncertain to about one part in 5,000. Indeed, in recent years the uncertainty has been so great that the existence of entirely new forces has been postulated to explain gravitational anomalies.

 
In the early 1980s, Frank Stacey and his colleagues measured G in deep mines and boreholes in Australia. Their value was about 1 percent higher than currently accepted. For example, in one set of measurements in the Hilton mine in Queensland the value of G was found to be 6.734 ± 0.002, as opposed to the currently accepted value of 6.672 ± 0.003 (both in units of 10^-11 N m^2 kg^-2). The Australian results were repeatable and consistent, but no one took much notice until 1986. In that year Ephraim Fischbach, at the University of Washington, Seattle, sent shock waves around the world of science by claiming that laboratory tests also showed a slight deviation from Newton's law of gravity, consistent with the Australian results. Fischbach proposed the existence of a hitherto unknown repulsive force, the so-called fifth force (the four known forces being the strong and weak nuclear forces, the electromagnetic force, and the gravitational force).

 
The possible existence of a fifth force is not particularly relevant to possible changes in G with time. But the very fact that the question of an extra force affecting gravitation could even be raised and seriously considered in the late twentieth century serves to emphasize how imprecise the characterization of gravity remains more than three centuries after the publication of Newton's Principia.

 
The suggestion by Paul Dirac and other theoretical physicists that G may be decreasing as the universe expands has been taken quite seriously by some metrologists. However, the change proposed by Dirac was very small, about 5 parts in 10^11 per year. This is way below the limits of detection using conventional methods of measuring G on Earth. The 'best' results in the last twenty years differ from each other by more than 5 parts in 10^4. In other words, the change Dirac was suggesting is some ten million times smaller than the differences between recent 'best' values.
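
The 'ten million times smaller' comparison can be checked directly from the two figures just quoted (a drift of about 5 parts in 10^11 per year against a spread of about 5 parts in 10^4 between recent 'best' values):

Code:
# Quick arithmetic check of the ratio quoted above.
dirac_drift_per_year = 5e-11     # Dirac's proposed fractional change in G per year
spread_between_values = 5e-4     # rough fractional spread between recent 'best' values

print(spread_between_values / dirac_drift_per_year)  # 1e7, i.e. about ten million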

 
In order to test Dirac's hypothesis, a variety of indirect methods have been tried. Some depend on geological evidence, such as the slopes of fossils and dunes, from which the gravitational forces at the time they were formed can be calculated; others depend on records of eclipses over the last 3,000 years; others on modern astronomical methods.

 
The problem with all these indirect lines of evidence is that they depend on a complex tissue of theoretical assumptions, including the constancy of the other constants of nature. They are persuasive only within the framework of the present paradigm. That is to say that if one assumes the correctness of modern cosmological theories, themselves presupposing the constancy of G, the data are internally consistent, provided that all actual variations from experiment to experiment, or method to method, are assumed to be a result of error.
 
The fall in the speed of light from 1928 to 1945
According to Einstein's theory of relativity, the speed of light in a vacuum is invariant: it is an absolute constant. Much of modern physics is based on that assumption. There is therefore a strong theoretical prejudice against raising the question of possible changes in the velocity of light. In any case, the question is now officially closed. Since 1972 the speed of light has been fixed by definition. The value is defined as 299,792.458 ± 0.0012 kilometers per second.

 
As in the case of the universal gravitational constant, early measurements of c differed considerably from the present official value. For example, the determination by Römer in 1676 was about 30 percent lower, and that by Fizeau in 1849 about 5 percent higher.

 
In 1929, Birge published his review of all the evidence available up to 1927 and came to the conclusion that the best value for velocity of light was 299,796 ± 4 km/s. He pointed out that the probable error was far less than in any of the other constants, and concluded that 'the present value of c is entirely satisfactory, and can be considered as more or less permanently established.' However, even as he was writing, considerably lower values of c were being found, and by 1934 it was suggested by Gheury de Bray that the data pointed to a cyclic variation in the velocity of light.
 From around 1928 to 1945, the velocity of light appeared to be about 20 km/s lower than before and after this period. The 'best' values, found by the leading investigators using a variety of techniques, were in impressively close agreement with each other, and the available data were combined and adjusted by Birge in 1941 and Dorsey in 1945.

 
In the late 1940s the speed of light went up again. Not surprisingly, there was some turbulence at first as the old value was overthrown. The new value was about 20 km/s higher, close to that prevailing in 1927. A new consensus developed. How long this consensus would have lasted if based on continuing measurements is a matter for speculation. In practice, further disagreement was prevented by fixing the speed of light in 1972 by definition.

 
How can the lower velocity from 1928 to 1945 be explained? If it was simply a matter of experimental error, why did the results of different investigators and different methods agree so well? And why were the estimated errors so low?

 
One possibility is that the velocity of light really does fluctuate from time to time. Perhaps it really did drop for nearly twenty years. But this is not a possibility that has been seriously considered by researchers in the field, except for de Bray. So strong is the assumption that it must be fixed that the empirical data have to be explained away. This remarkable episode in the history of the speed of light is now generally attributed to the psychology of metrologists:

 
The tendency for experiments in a given epoch to agree with one another has been described by the delicate phrase 'intellectual phase locking.' Most metrologists are very conscious of the possible existence of such effects; indeed ever-helpful colleagues delight in pointing them out! . . . .Aside from the discovery of mistakes, the near completion of the experiment brings more frequent and stimulating discussion with interested colleagues and the preliminaries to writing up the work add fresh perspective. All of these circumstances combine to prevent what was intended to be 'the final result' from being so in practice, and consequently the accusation that one is most likely to stop worrying about corrections when the value is closest to other results is easy to make and difficult to refute.

 
But if changes in the values of constants in the past are attributed to the experimenters' psychology, then, as other eminent metrologists have observed, 'this raises a disconcerting question: How do we know that this psychological factor is not equally important today?' In the case of the velocity of light, however, this question is now academic. Not only is the velocity fixed by definition, but the very units in which velocity is measured, distance and time, are defined in terms of light itself.

 
The second used to be defined as 1/86,400 of a mean solar day, but it is now defined in terms of the frequency of light emitted by a particular kind of excitation of caesium-133 atoms. A second is 9,192,631,770 times the period of vibration of the light. Meanwhile, since 1983 the meter has been defined in terms of the velocity of light, itself fixed by definition.

 
As Brian Petley has pointed out, it is conceivable that:
 (i) the velocity of light might change with time, or (ii) have a directional dependence in space, or (iii) be affected by the motion of the Earth about the Sun, or motion within our galaxy or some other reference frame.

 
Nevertheless, if such changes really happened, we would be blind to them. We are now shut up within an artificial system where such changes are not only impossible by definition, but would be undetectable in practice because of the way the units are defined. Any change in the speed of light would change the units themselves in such a way that the velocity in kilometers per second remained exactly the same.
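
The blindness described here can be illustrated with a toy calculation. In the sketch below (an illustration of the argument only, not a physical model), a hypothetical 'true' speed of light in some imaginary absolute units is allowed to drift; but because the meter is defined as the distance light travels in 1/299,792,458 of a second, the measured value in meters per second comes out identical by construction.

Code:
# Toy illustration: if the meter is defined via c, a drifting "true" c is undetectable.
# The "absolute units" are hypothetical bookkeeping, not a real physical reference frame.

DEFINED_C = 299_792_458  # m/s, fixed when the meter was redefined in 1983

def measured_c(true_c_absolute_units_per_second):
    # The meter is *defined* as the distance light travels in 1/DEFINED_C seconds,
    # so one meter corresponds to true_c / DEFINED_C absolute units.
    meter_in_absolute_units = true_c_absolute_units_per_second / DEFINED_C
    # Measuring the speed of light in these meters per second:
    return true_c_absolute_units_per_second / meter_in_absolute_units

for true_c in (1.00e8, 1.01e8, 0.97e8):  # let the "true" value drift by a few percent
    print(measured_c(true_c))            # always 299792458.0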
 
The rise of Planck's constant
Planck's constant, h, is a fundamental feature of quantum physics and relates the frequency of a radiation, ν, with its quantum of energy, E, according to the formula E = hν. It has the dimensions of action (energy × time).
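
As a concrete instance of E = hν: the energy of a single photon of green light, with a frequency of roughly 5.5 × 10^14 Hz, using the currently listed value of h (which, like c, has itself been fixed by definition since the 2019 redefinition of the SI units):

Code:
# E = h * nu for one photon of green light.
h = 6.62607015e-34   # Planck's constant in J s (the exact value fixed in the 2019 SI)
nu = 5.5e14          # approximate frequency of green light in Hz

E = h * nu           # energy of one photon, in joules
print(E)             # ~3.6e-19 J, i.e. a couple of electron-volts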

 
We are often told that quantum theory is brilliantly successful and amazingly accurate. For example: 'The laws that have been found to describe the quantum world. . . are the most accurate and precise tools we have ever found for the successful description and prediction of the workings of Nature. In some cases the agreement between the theory's predictions and what we measure is good to better than one part in a billion.'

 
I heard and read such statements so often that I used to assume that Planck's constant must be known with tremendous accuracy to many places of decimals. This seems to be the case if one looks it up in a scientific handbook--so long as one does not also look at previous editions. In fact its official value has changed over the years, showing a marked tendency to increase.

 
The biggest change occurred between 1929 and 1941, when it went up by more than 1 percent. This increase was largely due to a substantial change in the value of the charge on the electron, e. Experimental measurements of Planck's constant do not give direct answers, but also involve the charge on the electron and/or the mass of the electron. If either or both of these other constants change, then so does Planck's constant.


Millikan's work on the charge on the electron turned out to be one of the roots of the trouble. Even though other researchers found substantially higher values, they tended to be disregarded. 'Millikan's great renown and authority brought about the opinion that the question of the magnitude of e had practically got its definitive answer.' For some twenty years Millikan's value prevailed, but evidence went on building up that e was higher. As Richard Feynman has expressed it:

 
It's interesting to look at the history of measurements of the charge on the electron after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan's, the next one's a little bigger than that, and the next one's a little bit bigger than that, until finally they settle down to a number that is higher. Why didn't they discover that the new number was higher right away? It's a thing that scientists are ashamed of--this history--because it's apparent that people did things like this: When they got a number that was too high above Millikan's, they would look for and find a number closer to Millikan's value when they didn't look so far. And so they eliminated the numbers that were too far off, and did other things like that.

 
In the late 1930s, the discrepancies could no longer be ignored, but Millikan's high-prestige value could not simply be abandoned either; instead it was corrected by using a new value for the viscosity of air, an important variable in his oil-drop technique, bringing it into alignment with the new results. In the early 1940s, even higher values of e led to a further upward revision of the official figure. Sure enough, reasons were found to correct Millikan's value yet again, raising it to agree with the new value. Every time e increased, so Planck's constant had to be raised as well.

 
Interestingly, Planck's constant continued to creep upwards from the 1950s to the 1970s. Each of these increases exceeded the estimated error in the previously accepted value. The latest value shows a slight decline.



 
Planck's Constant from 1951 to 1988 (Review Values)

Author               Date    h (× 10^-34 joule seconds)
Bearden and Watts    1951    6.623 63    ± 0.000 16
Cohen et al.         1955    6.625 17    ± 0.000 23
Condon               1963    6.625 60    ± 0.000 17
Cohen and Taylor     1973    6.626 176   ± 0.000 036
                     1988    6.626 075 5 ± 0.000 004 0
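
As a rough check on the claim that each revision exceeded the previously quoted error, the successive review values in the table above can be differenced and compared with the uncertainty attached to the earlier value in each pair:

Code:
# Compare successive review values of h (taken from the table above) with the
# uncertainty quoted for the earlier value in each pair.
reviews = [
    ("Bearden and Watts", 1951, 6.62363,   0.00016),
    ("Cohen et al.",      1955, 6.62517,   0.00023),
    ("Condon",            1963, 6.62560,   0.00017),
    ("Cohen and Taylor",  1973, 6.626176,  0.000036),
    ("(1988 review)",     1988, 6.6260755, 0.0000040),
]

for (_, y1, h1, err1), (_, y2, h2, _) in zip(reviews, reviews[1:]):
    change = h2 - h1
    print(f"{y1} -> {y2}: change = {change:+.7f}, "
          f"{abs(change) / err1:.1f} x the {y1} uncertainty")

On these figures the value rose by many times the quoted uncertainty between 1951 and 1973, and the 1988 value is slightly lower than the 1973 one, in line with the 'slight decline' mentioned in the text.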

 


Several attempts have been made to look for changes in Planck's constant by studying the light from quasars and stars assumed to be very distant on the basis of the red shift in their spectra. The idea was that if Planck's constant has changed, the properties of the light emitted billions of years ago should be different from more recent light. Little difference was found, leading to the seemingly impressive conclusion that h varies by less than 5 parts in 10^13 per year. But critics of such experiments have pointed out that these constancies are inevitable, since the calculations depend on the implicit assumption that h is constant; the reasoning is circular. (Strictly speaking, the starting assumption is that the product hc is constant; but since c is constant by definition, this amounts to assuming the constancy of h.)



 Fluctuations in the fine-structure constant
 One of the problems of looking for changes in a fundamental constant is that if changes are found in the constant, then it is difficult to know whether it is the constant itself that is changing, or the units in which it is measured. However, some of the constants are dimensionless, expressed as pure numbers, and hence the question of changes in units does not arise. One example is the ratio of the mass of the proton to the mass of the electron. Another is the fine-structure constant. For this reason, some metrologists have emphasized that 'secular changes in physical "constants" should be formulated in terms of such numbers.'

 
Accordingly, in this section I look at the evidence for changes in the fine-structure constant, α, formed from the charge on the electron, the velocity of light, and Planck's constant according to the formula α = e²/(2ε₀hc), where e is the charge on the electron, h is Planck's constant, c is the velocity of light, and ε₀ is the permittivity of free space. It gives a measure of the strength of electromagnetic interactions; its value is approximately 1/137, and it is often quoted as its reciprocal, about 137. This constant is treated by some theoretical physicists as one of the key cosmic numbers that a Theory of Everything should be able to explain.
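
The formula can be checked numerically against currently listed values of the constants; the reciprocal comes out close to 137, as stated:

Code:
# alpha = e^2 / (2 * epsilon_0 * h * c), evaluated with currently listed values.
e    = 1.602176634e-19   # elementary charge in C (exact in the 2019 SI)
h    = 6.62607015e-34    # Planck's constant in J s (exact in the 2019 SI)
c    = 299792458.0       # speed of light in m/s (fixed by definition)
eps0 = 8.8541878128e-12  # vacuum permittivity in F/m (CODATA 2018)

alpha = e**2 / (2 * eps0 * h * c)
print(alpha)        # ~7.297e-3
print(1 / alpha)    # ~137.036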

 
Between 1929 and 1941 the fine-structure constant increased by about 0.2 percent, from 7.283 × 10^-3 to 7.2976 × 10^-3. This change was largely attributable to the increased value for the charge on the electron, partly offset by the fall in the speed of light, both of which I have already discussed. As in the case of the other constants, there was a scatter of results from different investigators, and the 'best' values were combined and adjusted from time to time by reviewers. As in the case of the other constants, the changes were generally larger than would be expected on the basis of the estimated errors. For example, the increase from 1951 to 1963 was twelve times greater than the estimated error in 1951 (expressed as the standard deviation); the increase from 1963 to 1973 was nearly five times the estimated error in 1963.



 
The Fine-Structure Constant From 1951 to 1973

Author               Date    α (× 10^-3)
Bearden and Watts    1951    7.296 953   ± 0.000 028
Condon               1963    7.297 200   ± 0.000 033
Cohen and Taylor     1973    7.297 350 6 ± 0.000 006 0

 


Several cosmologists have speculated that the fine-structure constant might vary with the age of the universe, and attempts have been made to check this possibility by analyzing the light from stars and quasars, assuming that their distance is proportional to the red-shift of their light. The results suggest that there has been little or no change in the constant. But as with all other attempts to infer the constancy of constants from astronomical observations, many assumptions have to be made, including the constancy of other constants, the correctness of current cosmological theories, and the validity of red-shifts as indicators of distance. All of these assumptions have been and are still being questioned by dissident cosmologists.
 
Do constants really change?
 As we have seen with the four examples above, the empirical data from laboratory experiments reveal all sorts of variations as time goes on. Similar variations are found in the values of the other fundamental constants. These do not trouble true believers in constancy, because they can always be explained in terms of experimental error of one kind or another. Because of continual improvements in techniques, the greatest faith is always placed in the latest measurements, and if they differ from previous ones, the older ones are automatically discredited (except when the older ones are endowed with a high prestige, as in the case of Millikan's measurement of e). Also, at any given time, there is a tendency for metrologists to overestimate the accuracy of contemporary measurements, as shown by the way that later measurements often differ from earlier ones by amounts greater than the estimated error. Alternatively, if metrologists are estimating their errors correctly, then the changes in the values of the constants show that the constants really are fluctuating. The clearest example is the fall in the speed of light from 1928 to 1945. Was there a real change in the course of nature, or was it due to a collective delusion among metrologists?

 
So far there have been only two main theories about the fundamental constants. First, they are truly constant, and all variations in the empirical data are due to errors of one kind or another. As science progresses, these errors are reduced. With ever-increasing precision we come closer and closer to the constants' true values. This is the conventional view. Second, several theoretical physicists have speculated that one or more of the constants may vary in some smooth and regular manner with the age of the universe, or over astronomical distances. Various tests of these ideas using astronomical observations seem to have ruled out such changes. But these tests beg the question. They are founded on the assumptions that they set out to prove: that constants are constant, and that present-day cosmology is correct in all essentials.

 
There has been little consideration of the third possibility, which is the one I am exploring here, namely the possibility that constants may fluctuate, within limits, around average values which themselves remain fairly constant. The idea of changeless laws and constants is the last survivor from the era of classical physics in which a regular and (in principle) totally predictable mathematical order was supposed to prevail at all times and in all places. In reality, we find nothing of the kind in the course of human affairs, in the biological realm, in the weather, or even in the heavens. The chaos revolution has revealed that this perfect order was a beguiling illusion. Most of the natural world is inherently chaotic.

 
The fluctuating values of the fundamental constants in experimental measurements seem just as compatible with small but real changes in their values, as they are with a perfect constancy obscured by experimental errors. I now propose a simple way of distinguishing between these possibilities. I concentrate on the gravitational constant, because this is the most variable. But the same principles could be applied to any of the other constants too.

An experiment to detect possible fluctuations in the universal gravitational constant
 The principle is simple. At present, when measurements are made in a particular laboratory, the final value is based on an average of a series of individual measurements, and any unexplained variations between these measurements are attributed to random errors. Clearly, if there were real underlying fluctuations, either owing to changes in the earth's environment or to inherently chaotic fluctuations in the constant itself, these would be ironed out by the statistical procedures, showing up simply as random errors. As long as these measurements were confined to a single laboratory, there would be no way of distinguishing between these possibilities.

 
What I propose is a series of measurements of the universal gravitational constant to be made at regular intervals--say monthly--at several different laboratories all over the world, using the best available methods. Then, over a period of years, these measurements would be compared. If there were underlying fluctuations in the value of G, for whatever reason, these would show up at the various locations. In other words, the 'errors' might show a correlation--the values might tend to be high in some months and low in others. In this way, underlying patterns of variation could be detected that could not be dismissed as random error.
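
In statistical terms, the proposal amounts to testing whether the month-to-month deviations of G, measured independently at different laboratories, are correlated rather than independent. The sketch below (Python, using simulated numbers only, since no such coordinated data set exists) shows the basic computation: subtract each laboratory's own mean, then correlate the residual series between laboratories.

Code:
# Sketch of the proposed test: are monthly deviations in G correlated across labs?
# All data here are simulated purely for illustration.
import random
import statistics as st

random.seed(0)
months = 60
common = [random.gauss(0, 3e-4) for _ in range(months)]  # a hypothetical shared fluctuation

def lab_series(noise_level):
    # each lab sees the (hypothetical) common fluctuation plus its own measurement noise
    return [6.674e-11 * (1 + f + random.gauss(0, noise_level)) for f in common]

lab_a = lab_series(2e-4)
lab_b = lab_series(3e-4)

def residuals(series):
    mean = st.mean(series)
    return [x - mean for x in series]

r = st.correlation(residuals(lab_a), residuals(lab_b))  # Pearson r (Python 3.10+)
print(f"correlation between labs: {r:.2f}")
# A correlation well above zero would point to a shared fluctuation rather than
# independent random error; a value near zero is what the conventional view predicts.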

 
It would then be necessary to look for other explanations that did not involve a change in G, including possible changes in the units of measurement. How these inquiries would turn out is impossible to foresee. The important thing is to start looking for correlated fluctuations. And precisely because fluctuations are being looked for, there is more chance of finding them. By contrast, the current theoretical paradigm leads to a sustained effort by everyone concerned to iron out variations, because constants are assumed to be truly constant.

 
Unlike the other experiments proposed in this book, this one would involve a fairly large-scale international effort. Even so, the budget would not need to be huge if it took place in established laboratories already equipped to make such measurements. And it is even possible that it could be done by students. Several inexpensive methods for determining G have been described, based on the classical method of Cavendish using a torsion balance, and an improved student method has recently been developed which is accurate to 0.1 percent.

 
One of the advantages of the continual improvement in precision of metrological techniques is that it should become increasingly feasible to detect small changes in the constants. For example, a far greater accuracy in measurements of G should be possible when experiments can be done in spacecraft and satellites, and appropriate techniques are already being proposed and discussed. Here is an area where a big question really would need big science.

 
But there is in fact one way that this research could be done on a very low budget to start with: by examining the existing raw data for measurements of G at various laboratories over the last few decades. This would require the cooperation of the scientists concerned, because raw data are kept in scientists' notebooks and laboratory files, and many scientists are reluctant to allow others access to these private records. But given this cooperation, there may already be enough data to look for worldwide fluctuations in the value of G.

 
The implications of fluctuating fundamental constants would be enormous. The course of nature could no longer be imagined as blandly uniform; we would recognize that there are fluctuations at the very heart of physical reality. And if different fundamental constants varied at different rates, these changes would create differing qualities of time, not unlike those envisaged by astrology, but with a more radical basis."
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #36 on: Oct 02, 2014, 05:57:24 am »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!

Example 37

Modern man lives longer - the average age fallacy



The idea that our ancestors routinely died young has no basis in scientific fact!





Human Lifespans Nearly Constant for 2,000 Years


August 21, 2009

The Centers for Disease Control and Prevention, often the harbinger of bad news about E. coli outbreaks and swine flu, recently had some good news: The life expectancy of Americans is higher than ever, at almost 78.

Discussions about life expectancy often involve how it has improved over time. According to the National Center for Health Statistics, life expectancy for men in 1907 was 45.6 years; by 1957 it rose to 66.4; in 2007 it reached 75.5. Unlike the most recent increase in life expectancy (which was attributable largely to a decline in half of the leading causes of death including heart disease, homicide, and influenza), the increase in life expectancy between 1907 and 2007 was largely due to a decreasing infant mortality rate, which was 9.99 percent in 1907; 2.63 percent in 1957; and 0.68 percent in 2007.

But the inclusion of infant mortality rates in calculating life expectancy creates the mistaken impression that earlier generations died at a young age; Americans were not dying en masse at the age of 46 in 1907. The fact is that the maximum human lifespan — a concept often confused with "life expectancy" — has remained more or less the same for thousands of years. The idea that our ancestors routinely died young (say, at age 40) has no basis in scientific fact.

http://www.livescience.com/10569-human-lifespans-constant-2-000-years.html
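
The arithmetic behind the fallacy is easy to reproduce: a high infant mortality rate drags the average down even if surviving adults routinely reach old age. The sketch below uses deliberately simplified, hypothetical round numbers (death in infancy versus a single fixed adult age at death), purely to show the effect, not to reproduce the 1907 life tables.

Code:
# Simplified illustration of how infant mortality pulls down "life expectancy at birth".
# The ages and rates are hypothetical round numbers, not real life-table data.

def life_expectancy_at_birth(infant_mortality, adult_age_at_death, infant_age_at_death=0.5):
    # everyone either dies in infancy (on average halfway through the first year)
    # or survives infancy and dies at adult_age_at_death
    return (infant_mortality * infant_age_at_death
            + (1 - infant_mortality) * adult_age_at_death)

print(life_expectancy_at_birth(0.10, 65))  # ~58.6: high infant mortality, adults still reach 65
print(life_expectancy_at_birth(0.01, 65))  # ~64.4: same adult lifespan, much higher "expectancy"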
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #37 on: Oct 02, 2014, 06:13:37 am »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 37

Global Warming


Man made Global Warming or 'Climate Change' is a complete and deliberate lie.

It is impossible to watch the videos below and maintain any belief in man-made global warming. Do you know someone who believes in Global Warming and whom you want to wake up? If you can convince them to watch these interviews, they will change their mind.



www.50to1.net

The 50 to 1 Project
http://www.youtube.com/watch?v=Zw5Lda06iK0

What is the TRUE cost of climate change?  Is stopping it early really the cheapest plan in the long run?  50 to 1 explores the costs of stopping climate change vs adapting to it as and if it's required, and uncovers a simple truth: it's 50 times more expensive to try and STOP climate change than it is to simply ADAPT to it as and if required.


50 to 1 Project Interviews


Full length interview with Joanne Nova
http://www.youtube.com/watch?v=pgMZegvtXB0
Topher interviews Joanne Nova, a veteran science communicator and regular commentator on the ABC and many other places. Joanne speaks of her own journey and how she went from being a ‘veteran believer’ in Global Warming to being the high-profile skeptic she is today.


Full length interview with David Evans
http://www.youtube.com/watch?v=xI3doCKhI7Q
Topher interviews David Evans, former modeler for the Australian Greenhouse Office, now prominent skeptic. He explains the reasons for his change of mind and why he's become so vocal on the issue.


Full length interview with Anthony Watts
http://www.youtube.com/watch?v=RiuHOzykxC0
Topher interviews Anthony Watts, former weatherman and passionate believer in global warming, now world famous skeptic responsible for the ‘surface stations’ project which has found serious issues with the global temperature measuring network, and key figure within the ‘Climategate’ scandal.


Full length interview with Christopher Essex
http://www.youtube.com/watch?v=sUYpa5UHL2I
Topher interviews Christopher Essex, Professor of Applied Mathematics, who promptly ‘flips the checker board’ with questions about the very validity of such a thing as ‘Global temperature’.


Full length interview with Donna Laframboise
http://www.youtube.com/watch?v=U5weFQYBL5w
Topher interviews Donna Laframboise, former journalist turned investigative author. Donna has critiqued the Intergovernmental Panel on Climate Change’s claims about itself, its authors and its peer review process, and found them very VERY wanting…


Full length interview with Marc Morano
http://www.youtube.com/watch?v=y_crkSnRa4o
Topher interviews Marc Morano, accused 'criminal against humanity' and alleged 'central cell of the climate denial machine', and gets an insider's look into the politics and collateral damage caused by clumsy political responses to fears about climate change.


Full length interview with Fred Singer
http://www.youtube.com/watch?v=-G9mWL4nH00
Topher interviews Fred Singer, atmospheric and space physicist and long time hero of the environmental movement, and finds out why he founded the NON Governmental Panel on Climate Change and why he’s taken a high profile stand against the Intergovernmental Panel on Climate Change.


Full length interview with Henry Ergas
http://www.youtube.com/watch?v=QtnUovGY_9Q
Topher interviews Henry Ergas, a high profile Australian economist with a lot to say about carbon taxes and emissions trading schemes, and discovers some of the underlying reasons why politicians love carbon taxes and emissions trading schemes and why these ‘markets’ always seem to fail.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #38 on: Oct 09, 2017, 03:56:55 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 38

Modern man lives longer - the average age fallacy



The idea that our ancestors routinely died young has no basis in scientific fact!




Human Lifespans Nearly Constant for 2,000 Years

August 21, 2009

The Centers for Disease Control and Prevention, often the harbinger of bad news about E. coli outbreaks and swine flu, recently had some good news: The life expectancy of Americans is higher than ever, at almost 78.

Discussions about life expectancy often involve how it has improved over time. According to the National Center for Health Statistics, life expectancy for men in 1907 was 45.6 years; by 1957 it rose to 66.4; in 2007 it reached 75.5. Unlike the most recent increase in life expectancy (which was attributable largely to a decline in half of the leading causes of death including heart disease, homicide, and influenza), the increase in life expectancy between 1907 and 2007 was largely due to a decreasing infant mortality rate, which was 9.99 percent in 1907; 2.63 percent in 1957; and 0.68 percent in 2007.

But the inclusion of infant mortality rates in calculating life expectancy creates the mistaken impression that earlier generations died at a young age; Americans were not dying en masse at the age of 46 in 1907. The fact is that the maximum human lifespan — a concept often confused with "life expectancy" — has remained more or less the same for thousands of years. The idea that our ancestors routinely died young (say, at age 40) has no basis in scientific fact.

https://livescience.com/10569-human-lifespans-constant-2-000-years.html



Last Edit by Palmerston
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #39 on: Oct 09, 2017, 04:05:34 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 39

Science as a Public Good - Science as preached and practiced by the established higher educational institutions, corporate-funded think tanks, and commercial businesses produces scientific data that is biased at best and deliberately false at worst.


The questionable credibility of the Peer Review System

Experimental error and lack of reproducibility have dogged scientific research for decades. Of even greater concern are proliferating cases of outright fraud.

Medicine and the social sciences are particularly prone to bias, because the observer (presumably a white-coated scientist) cannot so easily be completely removed from his or her subject.

Double-blind tests (where neither the tester nor the subject knows for sure whether the test is real or just a control) are now required for many experiments and trials in both fields.
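
The mechanics of blinding are simple to sketch: treatment and control are assigned at random, subjects are known only by opaque codes, and the allocation key stays sealed until all outcomes are recorded. The snippet below is a minimal illustration of that idea, not a description of any particular trial protocol.

Code:
# Minimal sketch of double-blind assignment: random allocation under opaque codes,
# with the key kept sealed until the analysis stage.
import random

subjects = ["S01", "S02", "S03", "S04", "S05", "S06"]
random.shuffle(subjects)

allocation = {}  # the sealed key: subject code -> arm; hidden from testers and subjects
for i, code in enumerate(subjects):
    allocation[code] = "treatment" if i % 2 == 0 else "placebo"

# During the trial everyone works only with the anonymous codes:
print(sorted(allocation))

# Only after all outcomes are recorded is the key unsealed:
# print(allocation)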


The Myth of Science as a Public Good (by Terence Kealey)










Vice Chancellor of the University of Buckingham (Britain's only independent university), Terence Kealey is a vocal critic of government funding of science. His first book, 'The Economic Laws of Scientific Research,' argues that state funding of science is neither necessary nor beneficial, a thesis that he developed in his recently published analysis of the causes of scientific progress, 'Sex, Science and Profits.' In it, he makes the stronger claim that government funding is not merely unhelpful but in fact measurably obstructs scientific progress, whilst presenting an alternative, methodologically individualist understanding of 'invisible colleges' within which science resembles a private, not a public, good.

Recorded at Christ Church, University of Oxford, on 22nd May 2009.


Ref.

For Science's Gatekeepers, a Credibility Gap (2006)
By LAWRENCE K. ALTMAN, M.D.
https://nytimes.com/2006/05/02/health/02docs.html?pagewanted=all&_r=0
Recent disclosures of fraudulent or flawed studies in medical and scientific journals have called into question as never before the merits of their peer-review system.

Impartial judgment by the "gatekeepers" of science: fallibility and accountability in the peer review process.
Hojat M, Gonnella JS, Caelleigh AS.
https://ncbi.nlm.nih.gov/pubmed/12652170

Misconduct in science communication and the role of editors as science gatekeepers
https://vimeo.com/86692444

Science Journal Pulls 60 Papers in Peer-Review Fraud
By HENRY FOUNTAIN, July 10, 2014
https://nytimes.com/2014/07/11/science/science-journal-pulls-60-papers-in-peer-review-fraud.html

Report finds massive fraud at Dutch universities
https://nature.com/news/2011/111101/full/479015a.html

Scientific fraud, sloppy science – yes, they happen
https://theconversation.com/scientific-fraud-sloppy-science-yes-they-happen-13948



Last Edit by Palmerston
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #40 on: Oct 09, 2017, 05:16:37 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 40

The Black [Bubonic] Plague (1334 - 1348) was probably not a bacterial plague. It was most likely a hemorrhagic plague.



Researchers Argue 'Black Death' Was Due To Ebola, Not Bubonic Plague

A new book, titled Biology of Plagues: Evidence from Historical Populations, argues that the "Black Death" may not have been caused by the bubonic plague, as history textbooks would suggest, but rather by an Ebola-like [hemorrhagic] virus.

The authors, Christopher Duncan and Susan Scott of the University of Liverpool, claim that the bubonic plague could not have spread across Europe at the rate at which the Black Death did.

Duncan says, "If you look at the way it spreads, it was spreading at a rate of around 30 miles in two to three days. Bubonic plague moves at a pace of around 100 yards a year."

Duncan and Scott also analyzed the symptoms described in historical texts. Autopsy reports detail the internal organs of victims having dissolved, along with the appearance of black liquid. The liquefaction of internal organs is a trademark of the Ebola virus and causes its victims tremendous pain.

The oozing lymph nodes that so notoriously accompanied the Black Death could also be symptomatic of an Ebola-like virus. In both cases, hemorrhagic fevers come on fast and cause blood vessels to burst underneath the skin. This is what brings out the welts, or "buboes" as they were called during the time of the Black Death.

The authors also noted that efforts to quarantine the Black Death were successful - something that would not have been possible had the disease been transmitted by rats, as history has suggested, since rats do not observe quarantines.

But not everyone is convinced. Ann Carmichael, a historian and expert on the Black Plague says, "It is problematic to assimilate evidence over four centuries and draw conclusive theories," she says, "We must look at it on a plague-by-plague basis."

http://www.designntrend.com/articles/18109/20140818/researchers-argue-black-death-due-ebola-bubonic-plague.htm
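
Duncan's spread-rate comparison above can be put into common units: roughly 30 miles in two to three days for the Black Death against about 100 yards per year for bubonic plague.

Code:
# Rough comparison of the two spread rates quoted by Duncan.
YARDS_PER_MILE = 1760

black_death_miles_per_day = 30 / 2.5                    # ~12 miles/day (30 miles in 2-3 days)
bubonic_miles_per_day = (100 / YARDS_PER_MILE) / 365    # ~0.00016 miles/day (100 yards/year)

print(black_death_miles_per_day / bubonic_miles_per_day)  # roughly 77,000 times faster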




Molecular Clues Hint at What Really Caused the Black Death

http://www.livescience.com/15937-black-death-plague-debate.html

The Black Death arrived in London in the fall of 1348, and although the worst passed in less than a year, the disease took a catastrophic toll. An emergency cemetery in East Smithfield received more than 200 bodies a day between the following February and April, in addition to bodies buried in other graveyards, according to a report from the time.

The disease that killed Londoners buried in East Smithfield and at least one of every three Europeans within a few years' time is commonly believed to be bubonic plague, a bacterial infection marked by painful, feverish, swollen lymph nodes, called buboes. Plague is still with us in many parts of the world, although now antibiotics can halt its course.

But did this disease really cause the Black Death? The story behind this near-apocalypse in 14th century Europe is not clear-cut, since what we know about modern plague in many ways does not match with what we know about the Black Death. And if plague isn't responsible for the Black Death, scientists wonder what could've caused the sweeping massacre and whether that killer is still lurking somewhere.

Now, a new study using bone and teeth taken from East Smithfield adds to mounting evidence exhumed from Black Death graves and tantalizes skeptics with hints at the true nature of the disease that wiped out more than a third of Europeans 650 years ago.

This team of researchers approached the topic with open minds when they began looking for genetic evidence of the killer.

"Essentially by looking at the literature on the Black Death there were several candidates for what could have been the cause," said Sharon DeWitte, one of the researchers who is now an assistant professor of anthropology at the University of South Carolina.

Their first suspect: Yersinia pestis, the bacterium that causes modern plague, including bubonic plague.

The speed of plague



In 1894, Alexander Yersin and another scientist separately identified Y. pestis during an epidemic in Hong Kong. Years later the bacterium was given his name. Yersin also connected his discovery to the pestilence that swept Europe during the Black Death, an association that has stuck.

One problem, however, is that compared to the wildfire-like spread of the Black Death, the modern plague moves more leisurely. The modern plague pandemic began in the Yunnan Province of China in the mid-19th century, then spread to Hong Kong and then via ship, to India, where it exacted the heaviest toll, and to San Francisco in 1899, among many other places.

The disease that caused the Black Death is believed to have traveled much quicker, arriving in Europe from Asia in 1347, after the Golden Horde, a Mongol Army, catapulted plague-infected bodies into a Genoese settlement near the Black Sea. The disease traveled with the Italian traders and later appeared in Sicily, according to Samuel Cohn, a professor of medieval history at the University of Glasgow and author of "The Black Death Transformed: Disease and Culture in Early Renaissance Europe" (Bloomsbury USA, 2003).

By about 1352, roughly five years after arriving in Europe, the disease had not only spread across the continent; the worst of it had already run its course.

This wave of devastation is particularly surprising considering the complicated and time-consuming process by which plague has been thought to spread. You can't catch bubonic plague from another person; instead, the process involves two classic villains: rats and fleas.

Once a flea bites a rat infected with plague, the pathogen Y. pestis grows in its gut. After about two weeks, the bacteria block the valve that opens into the flea's stomach. The starving flea then bites its host, by now probably a new, healthy rat or a person, more aggressively in an attempt to feed. All the while, the flea tries to clear out the bacterial obstruction and so regurgitates the pathogen onto the bite wounds, according to Ken Gage, chief of flea-borne disease activity with the U.S. Centers for Disease Control and Prevention.

The bulk of cases during the modern plague pandemic are believed to have been spread by rats and their fleas, according to Gage. The last rat-borne plague epidemic in the U.S. occurred in 1925; wild rodents have since become the primary source for infections. However, rat-associated outbreaks continue to occur in developing countries, according to the CDC.

Fast, furious and unfamiliar



Not only has the disease slowed down, it also seems to have become more restrained. The Black Death wiped out at least 30 percent of Europe's population at the time. But the peak of the modern pandemic, in India, killed less than 2 percent of the population, DeWitte has calculated from census data.

The list of discrepancies goes on: There is evidence the Black Death spread directly between humans — no rats and their fleas involved — and to areas where rats and their fleas didn't even live. In fact, archaeological and documentary evidence indicates rats were scarce during the mid-14th century.

What's more, bubonic plague doubters point out, deaths during the Black Death appear to have followed a different seasonal cycle than plague deaths in modern times. Some also point to discrepancies in the symptoms.

Alternative theories



With the plague's role called into question, other theories have been offered to fill the gap.

"There is a lot of evidence that suggests that Yersinia pestis may not have been the causative agent for the Black Death, and it was likely something else, and something else that is out there right now," said Brian Bossak, an environmental health scientist at Georgia Southern University.

He is among those who suspect a hemorrhagic virus — which causes bleeding and fever, like Ebola — swept through 14th-century Europe. The high lethality, rapid transmission and periodic resurgences seen in the Black Death are characteristic of a virus, according to Bossak, who frames this as a question in urgent need of resolution.

"Who knows if it won't happen again," he said. "It seems like every so often some disease comes out of nowhere."

Two other proponents of the virus theory, Susan Scott and Christopher Duncan of the University of Liverpool in the United Kingdom, have pointed to a possible genetic legacy left by a viral Black Death: a mutation, known as CCR5-delta32, found among Europeans, particularly those in the north. This mutation confers resistance against HIV, another virus, but does not prevent plague. It's possible that by passing over those with this mutation, the Black Death selected for this change in the genetic code, making it more common among Europeans, they argue.
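
The selection argument can be made concrete with a toy calculation: if people carrying at least one copy of a protective allele survive an epidemic more often than non-carriers, the allele's frequency climbs rapidly over repeated outbreaks. The sketch below is a deliberately simplified, made-up model (dominant protection, arbitrary survival rates chosen for illustration), not Scott and Duncan's actual analysis.

def next_frequency(p, carrier_survival=1.0, noncarrier_survival=0.7):
    """Allele frequency after one epidemic, assuming Hardy-Weinberg genotype
    proportions and full protection for anyone with at least one copy."""
    q = 1.0 - p
    carriers = p * p + 2 * p * q        # genotype share with >= 1 copy
    noncarriers = q * q
    survivors = carriers * carrier_survival + noncarriers * noncarrier_survival
    # Allele copies among survivors: homozygotes carry 2 copies, heterozygotes 1.
    allele_copies = (2 * p * p + 2 * p * q) * carrier_survival
    return allele_copies / (2 * survivors)

if __name__ == "__main__":
    p = 0.005   # assumed rare starting frequency
    for epidemic in range(1, 9):
        p = next_frequency(p)
        print(f"after epidemic {epidemic}: allele frequency = {p:.3f}")

With these invented numbers the frequency rises from 0.5 percent to several percent within a handful of epidemics, which is the shape of the argument Scott and Duncan make, whatever the true parameters were.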

To at least some degree, an alternative form of plague, pneumonic plague, offers a solution. While bubonic is the most common form of plague, plague can also infect the lungs, causing high fever, cough, bloody sputum and chills. This infection can spread person-to-person, and without antibiotic treatment it is nearly 100 percent fatal. Outbreaks have occurred in modern times, and it can develop as a result of a bubonic infection. But, it is unclear how much of a role it played in the Black Death — some evidence suggests it is not as contagious as commonly thought.

Rats and fleas



The Black Death just doesn't appear to have behaved the way the typical, modern rat-associated plague does, according to Gage, the flea expert. Even so, he says he is convinced that bubonic plague was responsible.

A group of French researchers found another possible insect carrier for the Black Death: lice. They were able to transmit fatal plague infections from sick rabbits to healthy ones via human body lice that fed on the rabbits. Substitute humans for bunnies, and this scenario offers a simpler, more cold-climate-friendly explanation than the conventional rat-flea model.

But fleas aren't out of the picture yet. Gage and his colleagues have found that many species of flea — including the Oriental rat flea, a widespread and important spreader of plague — can begin transmitting the infection much sooner than thought, before the bacterium blocks off its stomach. This supports the idea that a species of human-inhabiting fleas, whose guts the bacterium can't block well, could have spread the infection from person to person in areas without rats, Gage said.

Plague isn't picky about its warm-blooded victims; it can infect almost any mammal, although some, like humans, cats and rats, become severely ill when infected, according to Gage. The lack of records of massive rat die-offs during the Black Death also calls into question the role rats may have played then.

CSI: Black Death



Plague kills quickly and does not leave marks on the remains that archeologists are digging up centuries later. But in recent years scientists have begun searching for molecular clues in the remains of the dead, including DNA left by the killer bacterium.

While a number of studies have turned up positive results from graves believed to hold European plague victims, the results haven't always been clear-cut. For instance, a 2004 study of remains in five burial sites, including East Smithfield, was unable to find any evidence of the bacteria.

Looking for genetic traces of a pathogen within 650-year-old bones is a challenging proposition, according to Hendrik Poinar, an evolutionary geneticist at McMaster University who worked with DeWitte, then at the University at Albany, on the most recent study. After so many years in the ground, the DNA is damaged and present only in tiny fragments, and, what's more, each sample contains only a minuscule amount of the pathogen — the rest belongs to the person and interlopers like soil bacteria, fungi, insects, even animals.

"You have to come up with a way to pull out the things of interest," Poinar said. So, after screening to detect the presence of Yersinia pestis in the 109 samples from the cemetery in East Smithfield, his lab employed a sort of sensitive fishing technique, using tiny segments of DNA that matched up with segments from a ring of DNA, called a plasmid, found in the bacterium.

Once they had retrieved this DNA, they assembled the full plasmid and compared it with modern versions of the bug. They found this plasmid matched many of the modern versions. They also sequenced a short section of DNA from the bacterium's nucleus and revealed three small changes unseen in the modern strains.
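
The "fishing" step can be pictured as keeping only the short, damaged fragments that share stretches of sequence with a reference. Below is a toy sketch in Python using simple exact k-mer matching; the sequences are invented for illustration, and this is not the authors' actual capture or assembly pipeline.

def kmers(seq, k):
    """Return the set of all length-k substrings of seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def screen(reads, reference, k=8, min_hits=1):
    """Keep reads sharing at least min_hits k-mers with the reference."""
    ref_kmers = kmers(reference, k)
    hits = []
    for read in reads:
        shared = kmers(read, k) & ref_kmers
        if len(shared) >= min_hits:
            hits.append((read, len(shared)))
    return hits

if __name__ == "__main__":
    # Stand-in for a stretch of the plasmid used as "bait".
    reference = "ATGCGTACGTTAGCCTAGGCTAACGGATCCTTAGCAT"
    reads = [
        "GTACGTTAGCCTAG",    # overlaps the reference, so it is retained
        "TTTTGGGGCCCCAAAA",  # unrelated DNA (host, soil microbes), discarded
    ]
    for read, n in screen(reads, reference):
        print(f"kept {read} ({n} shared 8-mers)")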

The results prove that a variant of Yersinia pestis infected the victims of the Black Death, the authors write in a recent issue of the journal Proceedings of the National Academy of Sciences.

Same bug, different disease?



This finding comes about a year after another genetic study, led by Stephanie Haensch of Johannes Gutenberg University in Germany, found evidence of two previously unknown strains of Yersinia pestis in the remains of European victims, and hints at a solution that could allow both sides to be right.

"People have always assumed the two diseases were the same," said Cohn, the medieval historian, referring to modern plague and the Black Death. "Even if it is the same pathogen, the diseases are very different."

Bossak, who has questioned the role of plague in the Black Death, agrees.

"This new (study) seems to support these earlier claims, and reinforces the notion that what we know of the epidemiology in modern Y. pestis plague may not fit the Black Death, perhaps because these ancient strains of Y. pestis are no longer present (assuming Y. pestis was indeed the causative agent)," he wrote in an email.

However, Poinar is more cautious. Although they had hoped to find changes that explained why the pathogen might have become less aggressive over the centuries, none have turned up so far. In fact, it's too early to say the changes detected represent any significant difference between the modern and ancient versions of the bacterium, according to him.

"We need the entire genome to say anything about this," Poinar wrote in an email, "and that is for future work."



Ref.

Definition: hemorrhagic plague (The hemorrhagic form of bubonic plague.)
http://medical-dictionary.thefreedictionary.com/hemorrhagic+plague


What caused the Black Death?
Duncan CJ, Scott S.
http://www.ncbi.nlm.nih.gov/pubmed/15879045

New Theories Link Black Death to Ebola-Like Virus
By MARK DERR New York Times
http://www.nytimes.com/2001/10/02/science/new-theories-link-black-death-to-ebola-like-virus.html

How the Black Death Worked
by Molly Edmonds
http://history.howstuffworks.com/historical-events/black-death5.htm

Did Yersinia pestis really cause Black Plague? Part 1: Objections to Y. pestis causation
http://scienceblogs.com/aetiology/2008/01/16/did-yersinia-pestis-really-cau/

On the trail of the Black Death
By Peter Lavelle
http://www.abc.net.au/science/articles/2004/01/22/2857189.htm



Last Edit by Palmerston
Are you in earnest resolved never to barter your liberty for the lordly servitude of a court, but to live free, fearless, and independent? Then never enter the place from whence so few have been able to return; never come within the circle of ambition; nor ever bring yourself into comparison with those masters of the earth who have already engrossed the attention of half mankind before you.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #41 on: Oct 09, 2017, 05:19:27 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 41

The world map we commonly use is distorted, incorrect and 500 years old.

[Image: world maps compared in the Mercator projection (blue) and the Gall–Peters projection (green)]

The Mercator projection (above in blue) is a cylindrical map projection presented by the Flemish geographer and cartographer Gerardus Mercator in 1569. It became the standard map projection for nautical purposes because of its ability to represent lines of constant course, known as rhumb lines or loxodromes, as straight segments which conserve the angles with the meridians. While the linear scale is equal in all directions around any point, thus preserving the angles and the shapes of small objects (which makes the projection conformal), the Mercator projection distorts the size and shape of large objects, as the scale increases from the Equator to the poles, where it becomes infinite.
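
As a rough illustration of the scale behaviour just described, here is a short Python sketch of the spherical Mercator formulas (a simplified model, not the ellipsoidal charts used for real navigation). The point scale factor is sec(latitude), so apparent areas grow as the square of that factor and blow up toward the poles.

import math

def mercator_xy(lon_deg, lat_deg, radius=1.0):
    """Project longitude/latitude in degrees to spherical Mercator x, y."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    return radius * lon, radius * math.log(math.tan(math.pi / 4 + lat / 2))

def area_inflation(lat_deg):
    """Factor by which the Mercator projection exaggerates area at a latitude."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

if __name__ == "__main__":
    for lat in (0, 30, 60, 70, 80, 85):
        x, y = mercator_xy(0, lat)
        print(f"lat {lat:2d}°: y = {y:6.3f}, area shown {area_inflation(lat):7.1f}x too large")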

VIDEO - http://youtu.be/vVX-PrBRtTY

The Gall–Peters projection (above in green), named after James Gall and Arno Peters, is one specialization of a configurable equal-area map projection known as the equal-area cylindric or cylindrical equal-area projection. It achieved considerable notoriety in the late 20th century as the centerpiece of a controversy surrounding the political implications of map design.



The Peters Projection World Map is one of the most stimulating, and controversial, images of the world. When this map was first introduced by historian and cartographer Dr. Arno Peters at a press conference in Germany in 1974, it generated a firestorm of debate. The first English-language version of the map was published in 1983, and it continues to have passionate fans as well as staunch detractors.

The earth is round. The challenge of any world map is to represent a round earth on a flat surface. There are literally thousands of map projections. Each has certain strengths and corresponding weaknesses. Choosing among them is an exercise in values clarification: you have to decide what's important to you. That is generally determined by the way you intend to use the map. The Peters Projection is an area-accurate map.

Maps based on the projection are promoted by UNESCO, and they are also widely used by British schools.

FACTS (checked numerically in the sketch below):

Mexico - Larger than Alaska by about 100,000 square miles
Africa - 14 times larger than Greenland
South America - Double the size of Europe
Germany - Located in the northernmost quarter of the earth
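
The first two figures, and the reason Greenland can look Africa-sized at all, can be checked with a few lines of Python. This is a rough sketch; the land areas below are approximate rounded values supplied for illustration, not figures from the article.

import math

# Approximate land areas in square kilometres (rounded public figures).
AREAS_KM2 = {
    "Africa": 30_370_000,
    "Greenland": 2_166_000,
    "Mexico": 1_964_000,
    "Alaska": 1_718_000,
}

KM2_PER_SQ_MI = 2.58999  # square kilometres in one square mile

def mercator_area_inflation(lat_deg):
    """Factor by which a Mercator map exaggerates area at a given latitude."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

if __name__ == "__main__":
    print(f"Africa vs Greenland: {AREAS_KM2['Africa'] / AREAS_KM2['Greenland']:.1f} times larger")
    diff_sq_mi = (AREAS_KM2['Mexico'] - AREAS_KM2['Alaska']) / KM2_PER_SQ_MI
    print(f"Mexico exceeds Alaska by about {diff_sq_mi:,.0f} square miles")
    # Greenland lies mostly between 60°N and 83°N; around 72°N a Mercator map
    # inflates areas by roughly a factor of ten, which is why a landmass
    # 14 times smaller than Africa can appear comparable in size.
    print(f"Mercator area inflation at 72°N: {mercator_area_inflation(72):.1f}x")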


Ref.

We have been misled by a flawed world map for 500 years
http://livelearnevolve.com/peters-projection-world-map/

What does the world really look like?

Last Edit by Palmerston
Are you in earnest resolved never to barter your liberty for the lordly servitude of a court, but to live free, fearless, and independent? Then never enter the place from whence so few have been able to return; never come within the circle of ambition; nor ever bring yourself into comparison with those masters of the earth who have already engrossed the attention of half mankind before you.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #42 on: Oct 09, 2017, 05:21:27 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 42

Myth: There are fewer slaves now than in the past.

Truth: There Are More Slaves Today Than at Any Time in Human History.



Although slavery is illegal in every country in the modern world, it still exists, and even on the narrowest definition of slavery it's likely that there are far more slaves now than there were victims of the Atlantic slave trade.

The last country to abolish slavery was the African state of Mauritania, where a 1981 presidential decree abolished the practice; however, no criminal laws were passed to enforce the ban. In August 2007 Mauritania's parliament passed legislation making the practice of slavery punishable by up to 10 years in prison.

One hundred forty-three years after passage of the 13th Amendment to the U.S. Constitution, and 60 years after Article 4 of the U.N.'s Universal Declaration of Human Rights banned slavery and the slave trade worldwide, there are more slaves than at any time in human history: 27 million!

Today’s slavery focuses on big profits and cheap lives. It is not about owning people like before, but about using them as completely disposable tools for making money.

During the four years that Benjamin Skinner researched modern-day slavery, he posed as a buyer at illegal brothels on several continents, interviewed convicted human traffickers in a Romanian prison and endured giardia, malaria, dengue and a bad motorcycle accident.

But Skinner is most haunted by his experience in a brothel in Bucharest, Romania, where he was offered a young woman with Down syndrome in exchange for a used car.

The institution of slavery is as old as civilization. Many nations and empires were built by the toil and suffering of slaves.

But what kinds of people were enslaved, and why? In ancient civilizations, slaves were usually war captives. The victors in battle might enslave the losers rather than killing them. Over time, people have found other reasons to justify slavery. Slaves were usually considered somehow different than their owners. They might belong to a different race, religion, nationality, or ethnic background. By focusing on such differences, slave owners felt they could deny basic human rights to their slaves.

And despite many efforts to end slavery, it still exists today. Some 27 million people worldwide are enslaved or work as forced laborers. That's more people than at any other point in the history of the world.

Cheap, Disposable People

An average slave in the American South in 1850 cost the equivalent of $40,000 in today’s money; today a slave costs an average of $90.

In 1850 it was difficult to capture a slave and then transport them to the US. Today, millions of economically and socially vulnerable people around the world are potential slaves.

This “supply” makes slaves today cheaper than they have ever been. Since they are so cheap, slaves today are not considered a major investment worth maintaining. If slaves get sick, are injured, outlive their usefulness, or become troublesome to the slaveholder, they are dumped or killed.

For most slaveholders, actually legally ‘owning’ the slave is an inconvenience, since they already exert total control over the individual's labor and profits. Who needs a legal document that could at some point be used against the slaveholder? Today the slaveholder cares more about these high profits than about whether holder and slave are of different ethnic backgrounds; in the new slavery, profit trumps skin color.

Finally, the new slavery is directly connected to the global economy. As in the past, most slaves are forced to work in agriculture, mining, and prostitution. From these sectors, their exploited labor flows into the global economy, and into our lives.


Ref.

BBC Modern slavery
http://www.bbc.co.uk/ethics/slavery/modern/modern_1.shtml

There Are More Slaves Today Than at Any Time in Human History
http://www.alternet.org/civil-liberties/there-are-more-slaves-today-any-time-human-history

Understanding Slavery
http://school.discoveryeducation.com/schooladventures/slavery/world.html

SLAVERY TODAY
https://www.freetheslaves.net/page.aspx?pid=301

Modern Slavery: The Secret World of 27 Million People
Publisher: Oneworld Publications; 1 edition (June 1, 2009), ISBN-10: 185168641X, ISBN-13: 978-1851686414

Slavery Affects 27 Million Lives Today: Legal Abolition vs. Effective Emancipation
http://activehistory.ca/2011/01/slavery-affects-27-million-lives-today-legal-abolition-vs-effective-emancipation/



Last Edit by Palmerston
Are you in earnest resolved never to barter your liberty for the lordly servitude of a court, but to live free, fearless, and independent? Then never enter the place from whence so few have been able to return; never come within the circle of ambition; nor ever bring yourself into comparison with those masters of the earth who have already engrossed the attention of half mankind before you.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #43 on: Oct 09, 2017, 05:58:06 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 43

Myth: Too much sun exposure causes skin cancer

Truth: The Sun Has Never Been Directly Linked to a single Skin Cancer Case



Whether we have fair skin or dark skin, thinking about how much sun exposure we get is a good idea because we don’t want to burn or get heat stroke. But another side effect that often is brought up with sun exposure is skin cancer. Does the sun cause skin cancer?

The sun is the giver of life on this planet; there is no doubting that. But is it dangerous to our health? What may be surprising is that, as of this writing, the sun has never been shown to be the sole cause of a single case of skin cancer. No medical or governmental report on skin cancer patients has identified the sun as the main cause. There have been suggestions that it may have played a role, but the results are flimsy due to the lack of evidence.

This myth was created and promoted by the sunscreen industry, dermatologists and the cancer industry. As with many other mainstream medical conditions, people who do get skin cancer are unfortunately just placed in the sun-exposure category without any real investigation into why they got it. This is an issue the medical field has largely neglected across many diseases, because answering it requires a great deal of time and testing with specific patients to really determine the causes.

We hear it everywhere: if you go out in the sun without sunscreen, you have a higher risk of getting skin cancer because of harmful UV rays. To put it more accurately, you do not get skin cancer from the sun; you get skin cancer because you are not properly nourished and are often exposed to other chemicals that are linked to cancer. The National Academy of Sciences published a review stating that the omega-6:3 ratio is the key to preventing skin cancers.

“Epidemiological, experimental, and mechanistic data implicate omega-6 fat as stimulators and long-chain omega-3 fats as inhibitors of development and progression of a range of human cancers, including melanoma.”

Interestingly, when we look at products we use daily that contain known cancer-causing agents, we find that the likelihood our skin cancers are linked to these products is much higher than the likelihood they are linked to the sun. Yet these products are overlooked and the immediate blame is put on the sun, a matter of public belief based on nothing but opinion and propaganda created amongst several industries.

Here is a list of some popular products used daily that have been linked to causing skin cancer:

Processed foods
Pharmaceutical drugs & prescriptions
Most Shampoos & Conditioners
Many health & beauty products (creams, lotions, makeup)
Tap water (chlorine, fluoride)
Most deodorant & cologne
Ink (computers, newspapers, magazines, fliers)
Cleaning supplies
Chemical fertilizers (used on most commercial produce)
Cigarettes
Pesticides & herbicides (found on non organic foods)
Hair dye
Oil products
Most plastics
Fluorescent light bulbs (energy efficient bulbs)

I think we often forget to look at the obvious. When we step back and think about it, how can the one thing that gives life to everything on this planet be so harmful to us? Yet here we are as a community stating that every case of skin cancer was most likely caused by too much sun exposure, while every day we use chemicals that have a much, much higher chance of creating skin cancer within the body.

We have been heavily trained to use protective products on our skin when we go out in the sun. We all know these products as sunscreens. The fact is, these “protective” agents are actually accelerating the creation of cancer within our bodies because they are toxic chemicals.

Researchers at the Environmental Working Group, a Washington-based nonprofit, released a report confirming that nearly half of the 500 most popular sunscreen products actually increase the speed at which malignant cells develop and spread skin cancer, because they contain vitamin A and its derivatives, retinol and retinyl palmitate. The FDA has known for years that these substances are cancer-causing and toxic, but it simply has not taken any action to notify the public of the dangers.

We must also look at the vitamin D factor here. We have all been told the sun provides vitamin D for us as it is absorbed through the skin. This is true. However, if you are wearing sunscreen, your body is unable to take in any vitamin D. How many times have you gone out in the sun to get some fresh air and an intake of vitamin D and put on sunscreen? For many people, it is probably most of the time. This process increases the risk of cancer not just because a cancer-causing chemical is being applied to the skin, but because vitamin D, when levels are properly maintained in the body, is responsible for a decrease in 4 out of 5 cancers of every kind, as well as many other diseases.

So if you are the type of person who spends a lot of time in the sun and wants to avoid burning the skin, you have many simple options that will not cause you any harm. First off, nourish your body and skin with the proper vitamins and antioxidants. If you are deficient in antioxidants and vitamin B, you are much more likely to get a sunburn, as your skin is not healthy enough to take in the UV rays properly. Those with darker skin have a built-in sunscreen of sorts, which means they will have a much harder time getting burnt. When out in the sun, limit your exposure to smaller increments to build up your skin's health.

Use sunscreen only as a last resort, when out in the sun for a long time. When shopping for sunscreens, be sure to read the labels and avoid buying sunscreens loaded with toxic chemicals; look out for oxybenzone and retinyl palmitate. It may be tough to find, but a trip to a natural health store can often do the trick. Look for sunscreens that contain zinc and titanium minerals as opposed to the toxic chemicals listed above. Only use sunscreens when absolutely necessary!


Ref.

Melanoma epidemic: a midsummer night’s dream?
N.J. Levell, C.C. Beattie, S. Shuster and D.C. Greenberg

Article first published online: 9 JUN 2009
DOI: 10.1111/j.1365-2133.2009.09299.x

Medical Records State: The Sun Has Never Been Directly Linked to Skin Cancer Case, by Joe Martino, December 18, 2011
http://www.collective-evolution.com/2011/12/18/the-sun-cancer-myth-the-sun-causes-skin-cancer/

Sunlight alone does not cause skin cancer: The truth you've never been told
Article - http://www.naturalnews.com/029146_sun_exposure_skin_cancer.html#ixzz3P41v4xBw
Video - http://tv.naturalnews.com/v.asp?v=5A62FC73922FD51A88E62E42C5A0AD5E

Sunlight Does Not Cause Skin Cancer - Dr. Michael Teplitsky MD

Sunscreen Causes, Not Prevents Skin Cancer
https://www.youtube.com/watch?v=5Rym0ZcdI5c&spfreload=10

Myth: Sunlight Causes Skin Cancer
http://www.canhealyourself.com/Myth_Sunlight_Causes_Cancer.htm

Scientists Blow The Lid on Cancer & Sunscreen Myth
http://theunboundedspirit.com/scientists-blow-the-lid-on-cancer-sunscreen-myth/

Don't let the phoney melanoma scare keep you out of the sun
http://www.theguardian.com/commentisfree/2010/jul/21/melanoma-myth-skin-cancer-sun

THE DANGERS OF TOXIC SUNSCREENS REVEALED!

Published on Jun 25, 2016
Dr. Group joins the Show to talk about the dangers of toxic sunscreens.



Last Edit by Palmerston
Are you in earnest resolved never to barter your liberty for the lordly servitude of a court, but to live free, fearless, and independent? Then never enter the place from whence so few have been able to return; never come within the circle of ambition; nor ever bring yourself into comparison with those masters of the earth who have already engrossed the attention of half mankind before you.
 

Re: Everything You Know Is Wrong - I was social-engineered
« Reply #44 on: Oct 09, 2017, 06:03:28 pm »
 

Brocke

  • Global Moderator
  • InfoWarrior
  • *****
  • 178
    Posts
  • I am not a number, I am a free man!
Example 44

Common knowledge: There are 3 blood types, A, B and O, with Rh positive/negative subtypes

Truth: Actually there are more than 30 different blood group systems, and over 100 blood subtypes beyond the ABO system



The number of blood types depends on the blood group system you use.

There are a number of different blood group systems, with the International Society of Blood Transfusion recognizing up to 30 major group systems. The two main blood group systems are the ABO antigens and the RhD (Rhesus) antigen. Most antigens are protein molecules situated on the surface of the red blood cells, and it is with these two antigen systems that blood types are most commonly classified.

Researchers at the University of Vermont have discovered two new proteins on red blood cells that confirm the testable existence of two new blood types. It's an important discovery, one that'll greatly reduce the risk of incompatible blood transfusions among tens of thousands of people. But what we were more struck by in this press release was the fact that these two new blood types - named Junior and Langereis - bring the total number of recognized blood types up to 32. 32!

Turns out there's much more than just A, B, AB, and O: there are now 28 other, rarer types, often named after the person in whom they were discovered. These rarer types are identified by the presence of a particular group of antigens (substances that tell your immune system to send out antibodies), and many, like the Kell and MNS blood types, can actually be concurrent with more common blood types like A or O.
https://popsci.com.au/science/there-are-way-more-blood-types-than-you-think,377209

Background
Your blood is typed, or classified, according to the presence or absence of certain markers (antigens) found on red blood cells and in the plasma that allow your body to recognize blood as its own.

The ABO system consists of A, B, AB, and O blood types. People with type A have antibodies in the blood against type B. People with type B have antibodies in the blood against type A. People with AB have no anti-A or anti-B antibodies. People with type O have both anti-A and anti-B antibodies. People with type AB blood are called universal recipients, because they can receive any of the ABO types. People with type O blood are called universal donors, because their blood can be given to people with any of the ABO types.
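
The donor/recipient rules above reduce to a simple check: a red-cell transfusion is compatible when the recipient has no antibodies against any antigen on the donor's cells. Here is a minimal Python sketch for the ABO system only; it deliberately ignores Rh and the many minor antigens that real cross-matching must also consider.

# Antigens on the donor's red cells, and antibodies in the recipient's plasma.
ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}
ANTIBODIES = {"A": {"B"}, "B": {"A"}, "AB": set(), "O": {"A", "B"}}

def abo_compatible(donor, recipient):
    """True if the recipient has no antibody against the donor's antigens."""
    return not (ANTIGENS[donor] & ANTIBODIES[recipient])

if __name__ == "__main__":
    types = ("O", "A", "B", "AB")
    for donor in types:
        ok = [r for r in types if abo_compatible(donor, r)]
        print(f"Type {donor} red cells can be given to: {', '.join(ok)}")
    # The output shows O as the universal donor and AB as the universal recipient.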

There are over 100 other blood subtypes. Most have little or no effect on blood transfusions, but a few of them may be the main causes of mild transfusion reactions. Mild transfusion reactions are frightening, but they are rarely life-threatening when treated quickly.

Mild hemolytic transfusion reactions can happen when there is a mismatch of one of the more than 100 minor blood types. Most of the time, these reactions to the minor blood types are less serious than a mismatch of the ABO or Rh blood types.



Last Edit by Palmerston
Are you in earnest resolved never to barter your liberty for the lordly servitude of a court, but to live free, fearless, and independent? Then never enter the place from whence so few have been able to return; never come within the circle of ambition; nor ever bring yourself into comparison with those masters of the earth who have already engrossed the attention of half mankind before you.
 

 
