
CELL PHONES & WIRELESS DANGERS

The Largest Biological Experiment Ever
By Arthur Firstenberg

(originally published in the Eldorado Sun: http://www.eldoradosun.com/Archives/01-06_issue/Firstenberg.htm)


Fundamentals

The most basic fact about cell phones and cell towers is that they emit microwave radiation; so do Wi-Fi (wireless Internet) antennas, wireless computers, cordless (portable) phones and their base units, and all other wireless devices. If it's a communication device and it's not attached to the wall by a wire, it's emitting radiation. Most Wi-Fi systems and some cordless phones operate at the exact same frequency as a microwave oven, while other devices use a different frequency. Wi-Fi is always on and always radiating. The base units of most cordless phones are always radiating, even when no one is using the phone. A cell phone that is on but not in use is also radiating. And, needless to say, cell towers are always radiating.

Why is this a problem, you might ask? Scientists usually divide the electromagnetic spectrum into "ionizing" and "non-ionizing." Ionizing radiation, which includes x-rays and atomic radiation, causes cancer. Non-ionizing radiation, which includes microwave radiation, is supposed to be safe. This distinction has always reminded me of the propaganda in George Orwell's Animal Farm: "Four legs good, two legs bad." "Non-ionizing good, ionizing bad" deserves just as little trust.

An astronomer once quipped that if Neil Armstrong had taken a cell phone to the Moon in 1969, it would have appeared to be the third most powerful source of microwave radiation in the universe, next only to the Sun and the Milky Way. He was right. Life evolved with negligible levels of microwave radiation. An increasing number of scientists speculate that our body's own cells, in fact, use the microwave spectrum to communicate with one another, like children whispering in the dark, and that cell phones, like jackhammers, interfere with their signaling. In any case, it is a fact that we are all being bombarded, day in and day out, whether we use a cell phone or not, by an amount of microwave radiation that is some ten million times as strong as the average natural background. And it is also a fact that most of this radiation is due to technology that has been developed since the 1970s.

As far as cell phones themselves are concerned, if you put one up to your head you are damaging your brain in a number of different ways. First, think of a microwave oven. A cell phone, like a microwave oven and unlike a hot shower, heats you from the inside out, not from the outside in. And there are no sensory nerve endings in the brain to warn you of a rise in temperature because we did not evolve with microwave radiation, and this never happens in nature. Worse, the structure of the head and brain is so complex and non-uniform that "hot spots" are produced, where heating can be tens or hundreds of times what it is nearby. Hot spots can occur both close to the surface of the skull and deep within the brain, and also on a molecular level.

Cell phones are regulated by the Federal Communications Commission, and you can find, in the packaging of most new phones, a number called the Specific Absorption Rate, or SAR, which is supposed to indicate the rate at which energy is absorbed by the brain from that particular model. One problem, however, is the arbitrary assumption, upon which the FCC's regulations are based, that the brain can safely dissipate added heat at a rate of up to 1 degree C per hour. Compounding this is the scandalous procedure used to demonstrate compliance with these limits and give each cell phone its SAR rating. The standard way to measure SAR is on a "phantom" consisting, incredibly, of a homogeneous fluid encased in Plexiglas in the shape of a head. Presto, no hot spots! But in reality, people who use cell phones for hours per day are chronically heating places in their brain. The FCC's safety standard, by the way, was developed by electrical engineers, not doctors.
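To make the SAR number concrete, here is a minimal sketch (in Python) of the standard dosimetric formula, SAR = σE²/ρ. The tissue properties and field strength below are hypothetical, order-of-magnitude placeholders rather than measurements from any actual phone; only the 1.6 W/kg figure is the real FCC limit.

```python
# Illustrative SAR arithmetic -- a sketch, not a compliance test.
# SAR (W/kg) = sigma * E_rms^2 / rho, where sigma is tissue conductivity
# (S/m), E_rms is the RMS electric field inside the tissue (V/m), and
# rho is the tissue density (kg/m^3).

FCC_LIMIT_W_PER_KG = 1.6  # U.S. limit, averaged over 1 gram of tissue

def sar(sigma: float, e_rms: float, rho: float) -> float:
    """Specific Absorption Rate in W/kg."""
    return sigma * e_rms ** 2 / rho

# Hypothetical round numbers for brain tissue at cell-phone frequencies:
sigma = 0.8     # S/m, the rough order of grey-matter conductivity
rho = 1040.0    # kg/m^3, approximate density of brain tissue
e_field = 40.0  # V/m, hypothetical internal field strength

print(f"SAR = {sar(sigma, e_field, rho):.2f} W/kg "
      f"(FCC limit: {FCC_LIMIT_W_PER_KG} W/kg)")
```

Note that the formula is an average over whatever mass one chooses; averaging over a homogeneous phantom is precisely what washes out the hot spots described above, since a hot spot is a small region where σE² is locally far higher than the average.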

The Blood-Brain Barrier

The second effect that I want to focus on, which has been proven in the laboratory, should by itself have been enough to shut down this industry and should be enough to scare away anyone from ever using a cell phone again. I call it the “smoking gun” of cell phone experiments. Like most biological effects of microwave radiation, this has nothing to do with heating.

The brain is protected by tight junctions between adjacent cells of capillary walls, the so-called blood-brain barrier, which, like a border patrol, lets nutrients pass through from the blood to the brain, but keeps toxic substances out. Since 1988, researchers in the laboratory of a Swedish neurosurgeon, Leif Salford, have been running variations on this simple experiment: they expose young laboratory rats to either a cell phone or other source of microwave radiation, and later they sacrifice the animals and look for albumin in their brain tissue. Albumin is a protein that is a normal component of blood but that does not normally cross the blood-brain barrier. The presence of albumin in brain tissue is always a sign that blood vessels have been damaged and that the brain has lost some of its protection.

Here is what these researchers have found, consistently for 18 years: Microwave radiation, at doses equal to a cell phone’s emissions, causes albumin to be found in brain tissue. A one-time exposure to an ordinary cell phone for just two minutes causes albumin to leak into the brain. In one set of experiments, reducing the exposure level by a factor of 1,000 actually increased the damage to the blood-brain barrier, showing that this is not a dose-response effect and that reducing the power will not make wireless technology safer. And finally, in research published in June 2003, a single two-hour exposure to a cell phone, just once in the animal’s lifetime, permanently damaged the blood-brain barrier and, on autopsy 50 days later, was found to have damaged or destroyed up to 2 percent of an animal’s brain cells, including cells in areas of the brain concerned with learning, memory and movement. (1) Reducing the exposure level by a factor of 10 or 100, thereby duplicating the effect of wearing a headset, moving a cell phone further from your body, or standing next to somebody else’s phone, did not appreciably change the results! Even at the lowest exposure, half the animals had a moderate to high number of damaged neurons.
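As an aside on what a factor-of-10 or factor-of-100 reduction corresponds to physically: in free space, power density from a small source falls off as the inverse square of distance, so a tenfold increase in distance gives a hundredfold reduction. The sketch below assumes a hypothetical round-number handset power, and the far-field formula is only a rough approximation a few centimeters from the head, where near-field effects dominate.

```python
# Inverse-square fall-off of power density with distance.
# A rough illustration only: the far-field formula below is not
# strictly valid in the near field, centimeters from a handset.
import math

def power_density(p_watts: float, r_meters: float) -> float:
    """Free-space power density (W/m^2) at distance r from an isotropic source."""
    return p_watts / (4 * math.pi * r_meters ** 2)

P = 0.25  # W, a hypothetical average handset output power
for r in (0.01, 0.10, 1.00):  # 1 cm, 10 cm, 1 m
    print(f"r = {r:4.2f} m -> {power_density(P, r):8.3f} W/m^2")
# Going from 1 cm to 10 cm cuts the computed density 100-fold.
```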

The implications for us? Two minutes on a cell phone disrupts the blood-brain barrier, two hours on a cell phone causes permanent brain damage, and secondhand radiation may be almost as bad. The blood-brain barrier is the same in a rat and a human being.

These results caused enough of a commotion in Europe that in November 2003 a conference was held, sponsored by the European Union, titled “The Blood-Brain Barrier — Can It Be Influenced by RF [radio frequency]-Field Interactions?” as if to reassure the public: “See, we are doing something about this.” But, predictably, nothing was done about it, as nothing has been done about it for 30 years.

America’s Allan Frey, during the 1970s, was the first of many to demonstrate that low-level microwave radiation damages the blood-brain barrier. (2) Similar mechanisms protect the eye (the blood-vitreous barrier) and the fetus (the placental barrier), and the work of Frey and others indicates that microwave radiation damages those barriers also. (3) The implication: No pregnant woman should ever be using a cell phone.

Dr. Salford is quite outspoken about his work. He has called the use of handheld cell phones “the largest human biological experiment ever.” And he has publicly warned that a whole generation of cell-phone-using teenagers may suffer from mental deficits or Alzheimer’s disease by the time they reach middle age.

Radio-Wave Sickness

Unfortunately, cell phone users are not the only ones being injured, nor should we be worried only about the brain. The following brief summary is distilled from a vast scientific literature on the effects of radio waves (a larger spectrum which includes microwaves), together with the experiences of scientists and doctors all over the world with whom I am in contact.

Organs that have been shown to be especially susceptible to radio waves include the lungs, nervous system, heart, eyes, testes and thyroid gland. Diseases that have increased remarkably in the last couple of decades, and that there is good reason to connect with the massive increase in radiation in our environment, include asthma, sleep disorders, anxiety disorders, attention deficit disorder, autism, multiple sclerosis, ALS, Alzheimer’s disease, epilepsy, fibromyalgia, chronic fatigue syndrome, cataracts, hypothyroidism, diabetes, malignant melanoma, testicular cancer, and heart attacks and strokes in young people.

Radiation from microwave towers has also been associated with forest die-off, reproductive failure and population decline in many species of birds, and ill health and birth deformities in farm animals. The literature showing biological effects of microwave radiation is truly enormous, running to tens of thousands of documents, and I am amazed that industry spokespersons are getting away with saying that wireless technology has been proved safe or — just as ridiculous — that there is no evidence of harm.

I have omitted one disease from the above list: the illness that Caller B has, and that I have. A short history is in order here.

In the 1950s and 1960s, workers who built, tested and repaired radar equipment came down with this disease in large numbers. So did operators of industrial microwave heaters and sealers. The Soviets named it, appropriately, radio wave sickness, and studied it extensively. In the West its existence was denied totally, but workers came down with it anyway. Witness the congressional hearings held in 1981, chaired by then-Representative Al Gore, on the health effects of radio-frequency heaters and sealers: another episode of “See, we are doing something about this” while nothing is done.

Today, with the mass proliferation of radio towers and personal transmitters, the disease has spread like a plague into the general population. Estimates of its prevalence range up to one-third of the population, but it is rarely recognized for what it is until it has so disabled a person that he or she can no longer participate in society. You may recognize some of its common symptoms: insomnia, dizziness, nausea, headaches, fatigue, memory loss, inability to concentrate, depression, chest discomfort, ringing in the ears. Patients may also develop medical problems such as chronic respiratory infections, heart arrhythmias, sudden fluctuations in blood pressure, uncontrolled blood sugar, dehydration, and even seizures and internal bleeding.

What makes this disease so difficult to accept, and even more difficult to cope with, is that no treatment is likely to succeed unless one can also avoid exposure to its cause — and its cause is now everywhere. A 1998 survey by the California Department of Health Services indicated that at that time 120,000 Californians — and by implication 1 million Americans — were unable to work due to electromagnetic pollution. (4) The ranks of these so-called electrically sensitive are swelling in almost every country in the world; they are marginalized, stigmatized and ignored. With the level of radiation everywhere today, they almost never recover and sometimes take their own lives.

“They are acting as a warning for all of us,” says Dr. Olle Johansson of people with this illness. “It could be a major mistake to subject the entire world’s population to whole-body irradiation, 24 hours a day.” A neuroscientist at the famous Karolinska Institute in Stockholm, Dr. Johansson heads a research team that is documenting a significant and permanent worsening of public health that began precisely when the second-generation, 1800 MHz cell phones were introduced into Sweden in late 1997. (5,6) After a decade-long decline, the number of Swedish workers on sick leave began to rise in late 1997 and more than doubled during the next five years. During the same period of time, sales of antidepressant drugs also doubled. The number of traffic accidents, after declining for years, began to climb again in 1997. The number of deaths from Alzheimer’s disease, after declining for several years, rose sharply in 1999 and had nearly doubled by 2001. This two-year delay is understandable when one considers that Alzheimer’s disease requires some time to develop.

Uncontrolled Proliferation

If cell phones and cell towers are really deadly, have the radio and TV towers that we have been living with for a century been safe? In 2002 Örjan Hallberg and Olle Johansson coauthored a paper titled “Cancer Trends During the 20th Century,” which examined one aspect of that question. (7) They found, in the United States, Sweden and dozens of other countries, that mortality rates for skin melanoma and for bladder, prostate, colon, breast and lung cancers closely paralleled the degree of public exposure to radio waves during the past hundred years. When radio broadcasting increased in a given location, so did those forms of cancer; when it decreased, so did those forms of cancer. And, a sensational finding: country by country — and county by county in Sweden — they found, statistically, that exposure to radio waves appears to be as big a factor in causing lung cancer as cigarette smoking!
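A claim that mortality “closely paralleled” exposure is, in statistical terms, a correlation claim. The sketch below shows the kind of computation involved, a Pearson correlation coefficient, run on entirely made-up series; the actual figures and methodology are in the Hallberg-Johansson paper.

```python
# Pearson correlation between two series -- illustrative only,
# with invented numbers standing in for exposure and mortality data.
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical decade-by-decade values (not real data):
exposure  = [1, 2, 5, 12, 30, 55, 80, 100]              # relative broadcast exposure
mortality = [3.1, 3.4, 4.0, 5.2, 7.1, 9.0, 10.8, 12.5]  # deaths per 100,000

print(f"r = {pearson_r(exposure, mortality):.3f}")  # close to 1 for parallel trends
```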

Which brings me to address a widespread misconception. The biggest difference between the cell towers of today and the radio towers of the past is not their safety, but their numbers. The number of ordinary radio stations in the United States today is still less than 14,000. But cell towers and Wi-Fi towers number in the hundreds of thousands, and cell phones, wireless computers, cordless telephones and two-way radios number in the hundreds of millions. Radar facilities and emergency communication networks are also proliferating out of control. Since 1978, when the Environmental Protection Agency last surveyed the radio frequency environment in the United States, the average urban dweller’s exposure to radio waves has increased 1,000-fold, most of this increase occurring in just the last nine years. (8) In the same period of time, radio pollution has spread from the cities to rest like a ubiquitous fog over the entire planet.

The vast human consequences of all this are being ignored. Since the late 1990s a whole new class of environmental refugees has been created right here in the United States. More and more of us are sick and dying, seeking relief from our suffering, leaving our homes and our livelihoods, living in cars, trailers and tents in remote places. Unlike victims of hurricanes and earthquakes, we are not the subject of any relief efforts. No one is donating money to help us, to buy us a protected refuge; no one is volunteering to forego their cell phones, their wireless computers and their cordless phones so that we can once more be their neighbors and live among them.

The worried and the sick have not yet opened their hearts to each other, but they are asking questions. To answer Caller A: No shield or headset will protect you from your cell or portable phone. There is no safe distance from a cell tower. If your cell phone or your wireless computer works where you live, you are being irradiated 24 hours a day.

To Caller B: To effectively shield a house is difficult and rarely successful. There are only a few doctors in the United States attempting to treat radio wave sickness, and their success rate is poor — because there are few places left on Earth where one can go to escape this radiation and recover.

Yes, radiation comes down from satellites, too; they are part of the problem, not the solution. There is simply no way to make wireless technology safe.

Our society has become both socially and economically dependent, in just one short decade, upon a technology that is doing tremendous damage to the fabric of our world. The more entrenched we let ourselves become in it, the more difficult it will become to change our course. The time to extricate ourselves, both individually and collectively — difficult though it already is — is now.


CHOCOLATE OR CAROB?
Let’s look at chocolate!
Methylxanthines
It is at least vaguely understood by most people that the use of beverages containing methylxanthines (rhymes with Ethel Francine), such as caffeine, theobromine and theophylline, causes physical or physiologic damage. It is not well known, however, that these ill effects are serious, sometimes calamitous, and may involve any organ or tissue from scalp to sole. The reason for this widespread damage is to be found in the chemical nature of methylxanthines, their ability to alter the very protoplasm of cells, and to attach to or concentrate in cells for an unknown period of time.
Methylxanthines are found in coffee, tea, colas and chocolate. The subject of this discussion is primarily chocolate.
Effects of methylxanthines
The immediate effects of methylxanthines begin shortly after taking the drink or medication containing them, and last about four hours. After eating chocolate or drinking cocoa you may have imperfect balance, racing of the heart, a high-pitched voice, insomnia, fatigue, and finger tremor. Other symptoms may be delayed for hours or several days and include sleep disturbances, headache, restlessness, palpitations, tremulousness, unsteadiness, vertigo, reflex hyperexcitability, irritability, agitation, anxiety and general discomfort. (1)
If one is accustomed to the regular use of chocolate, one may feel less alert, less contented, more sleepy and irritable when there is a delay in drinking the cocoa or eating the candy. Many troublesome diseases are made worse by methylxanthines—heart disease, allergies, diabetes and fluid retention. Depression may be caused by them, and they most certainly contribute to our “violent society” with its crime and child abuse. Most gastrointestinal disturbances are aggravated and some are caused by methylxanthines. All of the methylxanthines have been associated with chromosome damage and deformities in the offspring of the user. Cancer is more common in those who use methylxanthines. Disease resistance is not as strong. And this is only a partial list!
Chocolate, Breast Disease and Prostatic Hypertrophy
Apparently all the methylxanthines step up cell growth in certain glandular tissues. Since they interfere with the normal activity of certain enzymes, they act as poisons. True to their chemical classification as cellular toxins, methylxanthines shut off enzyme signals, in this instance the signal to stop growing. As a result, certain glandular tissues under the influence of chocolate may begin developing cysts and fibrous tumors, especially in the breast: the so-called fibrocystic disease. One young physician with full-blown fibrocystic disease was consuming a massive quantity of methylxanthines daily (1,300 milligrams) when she learned of this effect on the breasts. She stopped using coffee, tea, colas and chocolate to determine if her breast disease would disappear. Within a month, the lumps in her breast started to subside. By two months the fibrocystic disease had disappeared. It was not without a struggle, as she suffered severe withdrawal headaches that could not be relieved even by her headache medications. It seems wise to advise women everywhere to cut out the use of methylxanthines as a breast cancer control measure. (2) Many physicians believe that the effect on the male prostate is similar to that on the female breast.
Evaluation of Chocolate as Food
An evaluation of chocolate with a judgment as to its suitability as a food will result in condemning chocolate on three counts: (1) its inherent chemical toxicity, (2) the additives required to make chocolate palatable, and (3) the harvesting and primary manufacture. Let us take a look at each of these factors individually:
(1) The inherent chemical characteristics of chocolate:
Theobromine, the principal methylxanthine in chocolate, causes central nervous system stimulation, sleeplessness, general or localized itching, depression and anxiety.
All brands of cocoa contain more tannin per cup than the estimated 2 grains per average cup of tea. Tannins have been implicated in certain cancers of the digestive tract. Children who have a bedwetting problem will have more difficulty when given cocoa. Caffeine content may be as high as 112 milligrams per cup of cocoa beverage. Cocoa may interfere with calcium absorption. (3) The cocoa consumed by children in the mistaken hope that the addition of cocoa and sugar will increase their calcium intake may actually tie up calcium they get from such excellent sources as whole grains, legumes and greens. Chocolate contains 0.45 to 0.49% oxalic acid. The oxalic acid combines with calcium to form an insoluble compound, calcium oxalate, which passes out of the body unabsorbed. (4)
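To put the oxalic acid figure in perspective, here is a back-of-envelope calculation. It assumes complete 1:1 molar binding of calcium by oxalic acid, which is an upper bound for illustration, not a measured value.

```python
# How much calcium could the oxalic acid in 100 g of chocolate tie up?
# Oxalic acid (H2C2O4) binds calcium 1:1 by mole to form calcium oxalate.
M_OXALIC = 90.03   # g/mol, molar mass of oxalic acid
M_CALCIUM = 40.08  # g/mol, molar mass of calcium

chocolate_g = 100.0
oxalic_fraction = 0.0047  # midpoint of the 0.45-0.49% range cited above
oxalic_g = chocolate_g * oxalic_fraction
calcium_bound_mg = oxalic_g / M_OXALIC * M_CALCIUM * 1000

print(f"{oxalic_g * 1000:.0f} mg oxalic acid can bind up to "
      f"{calcium_bound_mg:.0f} mg of calcium")
# -> about 470 mg of oxalic acid binding up to roughly 209 mg of calcium,
#    assuming complete complexation.
```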
(2) Additives required to mask bitterness:
A bitter taste is usually associated with harmful alkaloids, pyrolysates and strongly alkaline substances. An unpleasant taste sensation is a warning signal that something potentially injurious is in the mouth. Masking the injurious agent with sugar does not eliminate the danger.
A large amount of sugar is necessary to make chocolate palatable. Furthermore, oils must be combined with chocolate in order to eliminate an unpleasant grainy consistency. Generally, milk, cream, or oil is added, which produces an extremely rich and unhealthful food. Any reasonable quantity eaten is certain to obstruct digestion and cause fermentation.
(3) The natural contaminants in chocolate:
Most cocoa beans are produced in countries where sanitation levels are far below those generally practiced in the United States.
The cacao is a small, beautiful tree indigenous to the tropical regions of the world, where millions of pounds of chocolate, milk chocolate and cocoa powder are produced annually. Cocoa is defined as the food prepared by heating and cracking the beans from the cacao tree. Chocolate is the solid or semi-plastic food prepared by finely grinding cocoa, and must have a minimum of 50% fat.
The pods are cut from the tree, piled up in the yard of the farmer, and fermented, a process which takes from three to eight days. During this process, people walk over the piles; insects, rodents, small animals and other living things make their nests in the piles; and many types of contamination may occur during this primary part of the manufacture of chocolate. At the peak of fermentation the temperature builds up, which promotes the growth of bacteria and molds. It has been shown that large quantities of aflatoxin, the cancer-producing agent from the molds, can be produced in cocoa beans. (4) The fermentation is essential for the development of the chocolate flavor, and is driven by the beans’ own enzymes and by wild yeasts. After fermentation the seeds are sun dried or kiln dried and then are ready to be shipped to the chocolate manufacturers, where they are roasted and ground to make a chocolate “liquor” somewhat like soft peanut butter. In this stage bacterial contaminants multiply. (9)
Since sugar and fat both tend to exude from candy, additives are placed in candy to prevent the surfacing of these materials. Rancidity of the fats can usually be detected after six to twelve weeks of storage at 86 degrees F. The unpleasant flavor heralds the harmful change that occurs with aging of fats. Rancidity can be delayed by certain additives. Whipping agents and other additives provide lightness of texture. (5)
In a booklet published by the United States Department of Health, Education and Welfare entitled “The Food Defect Action Levels,” a listing of “current levels for natural or unavoidable defects in food,” the Food and Drug Administration specifies the natural defect levels allowable in chocolate in the form of insects, rodents and other natural contaminants. In chocolate and chocolate liquor used in the manufacture of such products as chocolate bars, up to 120 insect fragments or two rodent hairs per cup are allowed.
Four percent of cocoa beans may be infested with insects and still carry the blessing of the FDA. Visible or solid animal excreta must not exceed 10 milligrams per pound. For chocolate powder or pressed cakes there must not be more than 75 insect fragments in 3 tablespoons of the powder!
Many individuals who believe themselves to be allergic to chocolate are in fact allergic to the animal parts that are in chocolate. One 11-year-old boy was hospitalized for abdominal pain and vomiting blood. He had suddenly developed purpura, tiny spots of hemorrhage in the skin all over the body. It was discovered while he was in the hospital that his attacks of skin hemorrhage and abdominal pain could be brought on within a few minutes of giving chocolate, either by mouth or by scratch test on the skin. (6) Chocolate is also a common cause of “pruritus ani,” an uncomfortable itch around the anus, the terminal part of the colon. Stopping the use of chocolate results in prompt cessation of the itch. (6)
Should you like to have further information about this matter you may obtain materials from FDA Guidelines and Compliance Branch, Bureau of Foods, 200 C Street, SW, Washington, DC 20204.
It seems uncanny that chocolate could ever have gotten to be considered a special food for children. The Ladies Home Journal way back in October 1930 carried an ad for Baker’s Cocoa that read, “The weekly treat became a daily delight and Jimmy’s weight went up.” What a shame that children have ever been allowed to have any product from cocoa. Even though chocolate might induce children to drink more milk, along with empty calories from sugar and fat, in mouse experiments the extra milk does not improve nutrition, but only increases body fat! (6)
As for me, any one of the above features would banish chocolate from my diet forever. Fortunately for us chocolate lovers, a good substitute is available that has a much more favorable manufacture and a greater likelihood of being processed under sanitary conditions: carob. On all three counts it is a better product than chocolate. It contains no methylxanthines. It does not require sugar, being naturally slightly sweet. And, most esthetically, it does not require fermentation to develop its flavor. I recommend it as superior.
Agatha M. Thrash, MD
What is carob?
Carob comes from the carob tree (botanical name Ceratonia siliqua) which is grown mostly in the Middle East. The carob tree belongs to the legume family and produces long pods which are dried and ground finely to produce carob powder.
Referred to in the Bible, the carob pods helped sustain John the Baptist during his wilderness sojourn and are often called St. John’s Bread.
In ancient times, carob pods were considered to be very valuable because when used as feed for cattle and sheep, the animals flourished. The dried seeds from the pod were often used in trade like money. From this comes the word “carat” (from the Greek keration, “carob bean”), which jewelers still use today in describing gold.
What is the nutritional value of carob?
· Almost 8% protein (like other legumes) compared with 2-14% protein in pre-packaged, sugar-laden cereals sold in supermarkets.
· Contains a great deal of natural sugar: about 46%.
· Contains the minerals calcium, magnesium, and potassium.
· Contains some trace minerals such as iron, manganese, chromium, copper, and nickel.
Carob is not an “empty-calorie” food!
Compared with chocolate, carob is three times richer in calcium, but has one-third fewer calories and one-seventeenth the fat.
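Taking those ratios at face value, a small sketch shows what they imply against a per-100-gram cocoa-powder baseline. The baseline numbers are illustrative placeholders, not measured nutritional data.

```python
# What the stated ratios imply, given a hypothetical cocoa baseline.
cocoa = {"calcium_mg": 130, "kcal": 230, "fat_g": 14}  # per 100 g, placeholders

carob = {
    "calcium_mg": cocoa["calcium_mg"] * 3,  # "three times richer in calcium"
    "kcal": cocoa["kcal"] * 2 / 3,          # "one-third fewer calories"
    "fat_g": cocoa["fat_g"] / 17,           # "one-seventeenth the fat"
}

for key, value in carob.items():
    print(f"carob {key}: {value:.1f} (cocoa: {cocoa[key]})")
```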
Carob is also a rich source of pectin, which aids in digestion and elimination. Pectin is in the group of indigestible complex carbohydrates commonly called fiber. It is the substance which makes jams and jellies set up. The pectin in carob is useful for arresting simple diarrhea, nausea and vomiting. Pectin settles the stomach, and like all fiber, helps take up poisons and toxins in the intestine and eliminate them from the body. Use a 5% concentration, about 1 Tbsp. of carob powder to a cup of liquid; or simply make a paste of carob powder and water. Lignin is another member of the fiber family found in the “woody” part of the carob pod. Both the pectin and lignin have a beneficial cholesterol-lowering effect characteristic of dietary fiber.
Therefore, in order to achieve a fine “chocolaty” flavor with excellent nutritional value and the absence of harmful stimulation, use carob powder in any recipe calling for cocoa or chocolate.
References
1. Psychopharmacology in the Practice of Medicine by Murray E. Jarvik. Reviewed in Journal of Family Practice 4(6):1180-1188, 1977.
2. Medical World News, March 19, 1979.
3. Chocolate, Coca Cola, Cocoa and Coffee. International Nutrition Research Foundation, Riverside, California.
4. Journal of the Association of Official Analytical Chemists 62(5):1076-9, September 1979.
5. Gott, Phillip P. All about Candy and Chocolate. Chicago, IL: National Confectioners Association, 1958.
6. American Journal of Clinical Nutrition 6(2), 1960.
7. American Journal of Surgery, November 1951.
8. Journal of the American Dietetic Association 32(12):1171-4, December 1956.
9. Applied Microbiology 20:644-654, October 1970.
