Sunday, April 26, 2009


Science Facts
About 4.5 pounds of sunlight (as mass equivalent) strike the Earth each second.
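As a back-of-envelope check with E = mc² (assuming the standard solar constant of ~1361 W/m² and Earth's mean radius, neither of which comes from the fact list itself), the mass equivalent of the sunlight Earth intercepts works out to roughly 4.3 pounds per second:

```python
import math

# Back-of-envelope: mass equivalent of the sunlight Earth intercepts.
# Assumed round values: solar constant ~1361 W/m^2, Earth radius ~6.371e6 m.
SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere
EARTH_RADIUS = 6.371e6    # m
C = 2.998e8               # m/s, speed of light

# Power intercepted by Earth's cross-sectional disk.
power_w = SOLAR_CONSTANT * math.pi * EARTH_RADIUS ** 2

# E = mc^2  ->  m = E / c^2, per second of sunlight.
mass_per_second_kg = power_w / C ** 2
mass_per_second_lb = mass_per_second_kg * 2.205

print(f"{mass_per_second_lb:.1f} lb of sunlight per second")  # ~4.3 lb
```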

A 'jiffy' is an actual unit of time for 1/100th of a second!


A ball of glass will bounce higher than a ball made of rubber.

A cesium atom in an atomic clock beats over nine billion times a second.


A cubic yard of air weighs about 2 pounds at sea level.

A lump of pure gold the size of a matchbox can be flattened into a sheet the size of a tennis court.


A single cup of gasoline, when ignited, has the same explosive power as five sticks of dynamite.


According to studies, men change their minds two to three times more often than women.


Airports that are at higher altitudes require a longer airstrip due to lower air density.


All humans are 99.9% genetically identical, and 98.4% of human genes are the same as those of a chimpanzee.


Aluminum used to be more valuable than gold!


Approximately 70 percent of the earth is covered by water. Only 1 percent of this water is drinkable.


At room temperature, the average air molecule travels at the speed of a rifle bullet.


Balsa wood is classified as a hardwood!

Computer Facts


Another name for a Microsoft Windows tutorial is 'Crash Course'!

Bill Gates' house was designed using a Macintosh computer.

By the year 2012 there will be approximately 17 billion devices connected to the Internet.

Domain names are being registered at a rate of more than one million names every month.

E-mail has been around longer than the World Wide Web.

For every 'normal' webpage, there are five porn pages.

In the 1980s, an IBM computer wasn't considered 100 percent compatible unless it could run Microsoft Flight Simulator.

MySpace reports over 110 million registered users. Were it a country, it would be the tenth largest, just behind Mexico.

One of every 8 married couples in the US last year met online.

The average 21-year-old has spent 5,000 hours playing video games, has exchanged 250,000 e-mails, instant messages, and text messages, and has spent 10,000 hours on a mobile phone.

The average computer user blinks 7 times a minute, less than half the normal rate of 20.

The first banner advertising was used in 1994.

The first computer mouse was invented by Doug Engelbart around 1964, and it was made of wood.

FUN USELESS FACTS 1-10


The word 'byte' is a contraction of 'by eight.'

The dollar symbol ($) is a U combined with an S (U.S.)

Maine is the toothpick capital of the world.

Barbie's measurements if she were life size: 39-23-33.

The tune for the "A-B-C" song is the same as "Twinkle, Twinkle Little Star."

Des Moines has the highest per capita Jello consumption in the U.S.

Pinocchio is Italian for "pine head."

In the last 4000 years, no new animals have been domesticated.

Great Britain was the first country to issue postage stamps. Hence, the postage stamps of Britain are the only stamps in the world not to bear the name of the country of origin.

Since 1896, the beginning of the modern Olympics, only Greece and Australia have participated in every Games.

FUN FOOD FACTS 1-10


Hershey's Kisses are called that because the machine that makes them looks like it's kissing the conveyor belt.

Reindeer like to eat bananas.


Every year, kids in North America spend close to half a billion dollars on chewing gum.


Americans eat about 18 billion hot dogs a year.


The oldest piece of chewing gum is 9000 years old.


The man who played the voice of Bugs Bunny was allergic to carrots.

Apples are more effective at keeping people awake in the morning than caffeine.

Every time you lick a stamp you gain 1/10 of a calorie.

Yams have 10 times more vitamin C than sweet potatoes.

FUN DID YOU KNOW FACTS 1-10


Did You Know, Donald Duck comics were banned from Finland because he doesn't wear pants.

Did You Know, The cigarette lighter was invented before the match.
Did You Know, 40 percent of McDonald's profits come from the sales of Happy Meals.

Did You Know, TYPEWRITER is the longest word that can be made using the letters only on one row of the keyboard.
Did You Know, Butterflies taste with their feet.

Did You Know, Coca-Cola was originally green.

Did You Know, If you yelled for 8 years, 7 months and 6 days, you would have produced enough sound energy to heat one cup of coffee.

Did You Know, Every day more money is printed for Monopoly than by the US Treasury.
Did You Know, It is impossible to lick your elbow.

Did You Know, Intelligent people have more zinc and copper in their hair.

Interesting Olympic Facts


The Official Olympic Flag
Created by Pierre de Coubertin in 1914, the Olympic flag contains five interconnected rings on a white background. The five rings symbolize the five significant continents and are interconnected to symbolize the friendship to be gained from these international competitions. The rings, from left to right, are blue, yellow, black, green, and red. The colors were chosen because at least one of them appeared on the flag of every country in the world. The Olympic flag was first flown during the 1920 Olympic Games.

The Olympic Motto

In 1921, Pierre de Coubertin, founder of the modern Olympic Games, borrowed a Latin phrase from his friend, Father Henri Didon, for the Olympic motto: Citius, Altius, Fortius ("Swifter, Higher, Stronger").
The Olympic Oath

Pierre de Coubertin wrote an oath for the athletes to recite at each Olympic Games. During the opening ceremonies, one athlete recites the oath on behalf of all the athletes. The Olympic oath was first taken during the 1920 Olympic Games by Belgian fencer Victor Boin. The Olympic Oath states, "In the name of all competitors, I promise that we shall take part in these Olympic Games, respecting and abiding by the rules that govern them, in the true spirit of sportsmanship, for the glory of sport and the honor of our teams."
The Olympic Creed

Pierre de Coubertin got the idea for this phrase from a speech given by Bishop Ethelbert Talbot at a service for Olympic champions during the 1908 Olympic Games. The Olympic Creed reads: "The most important thing in the Olympic Games is not to win but to take part, just as the most important thing in life is not the triumph but the struggle. The essential thing is not to have conquered but to have fought well."

The Olympic Flame

The Olympic flame is a practice continued from the ancient Olympic Games. In Olympia (Greece), a flame was ignited by the sun and then kept burning until the closing of the Olympic Games. The flame first appeared in the modern Olympics at the 1928 Olympic Games in Amsterdam. The flame itself represents a number of things, including purity and the endeavor for perfection. In 1936, the chairman of the organizing committee for the 1936 Olympic Games, Carl Diem, suggested what is now the modern Olympic Torch relay. The Olympic flame is lit at the ancient site of Olympia by women wearing ancient-style robes and using a curved mirror and the sun. The Olympic Torch is then passed from runner to runner from the ancient site of Olympia to the Olympic stadium in the hosting city. The flame is then kept alight until the Games have concluded. The Olympic Torch relay represents a continuation from the ancient Olympic Games to the modern Olympics.
The Olympic Hymn

The Olympic Hymn, played when the Olympic Flag is raised, was composed by Spyros Samaras and the words added by Kostis Palamas. The Olympic Hymn was first played at the 1896 Olympic Games in Athens but wasn't declared the official hymn by the IOC until 1957.

Real Gold Medals

The last Olympic gold medals that were made entirely out of gold were awarded in 1912.

The Medals

The Olympic medals are designed especially for each individual Olympic Games by the host city's organizing committee. Each medal must be at least three millimeters thick and 60 millimeters in diameter. Also, the gold and silver Olympic medals must be made out of 92.5 percent silver, with the gold medal covered in six grams of gold.

The First Opening Ceremonies

The first opening ceremonies were held during the 1908 Olympic Games in London.

Opening Ceremony Procession Order

During the opening ceremony of the Olympic Games, the procession of athletes is always led by the Greek team, followed by all the other teams in alphabetical order (in the language of the hosting country), except for the last team which is always the team of the hosting country.

A City, Not a Country

When choosing locations for the Olympic Games, the IOC specifically gives the honor of holding the Games to a city rather than a country.

IOC Diplomats

In order to make the IOC an independent organization, the members of the IOC are not considered diplomats from their countries to the IOC, but rather are diplomats from the IOC to their respective countries.

First Modern Champion

James B. Connolly (United States), winner of the hop, step, and jump (the first final event in the 1896 Olympics), was the first Olympic champion of the modern Olympic Games.

The First Marathon

In 490 BCE, Pheidippides, a Greek soldier, ran from Marathon to Athens (about 25 miles) to inform the Athenians of the outcome of the battle with the invading Persians. The route was filled with hills and other obstacles; thus Pheidippides arrived in Athens exhausted and with bleeding feet. After telling the townspeople of the Greeks' success in the battle, Pheidippides fell to the ground dead. In 1896, at the first modern Olympic Games, a race of approximately the same length was held in commemoration of Pheidippides.

The Exact Length of a Marathon

During the first several modern Olympics, the marathon was always an approximate distance. In 1908, the British royal family requested that the marathon start at Windsor Castle so that the royal children could witness its start. The distance from Windsor Castle to the Olympic Stadium was 42,195 meters (or 26 miles and 385 yards). In 1924, this distance became the standardized length of a marathon.
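The metric and imperial figures above do agree; a quick unit-conversion check (using the standard definitions of the mile and yard):

```python
# Check that 26 miles 385 yards really equals 42,195 meters.
METERS_PER_MILE = 1609.344   # international mile
METERS_PER_YARD = 0.9144     # international yard

meters = 26 * METERS_PER_MILE + 385 * METERS_PER_YARD
print(meters)  # ~42195
```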

Women

Women were first allowed to participate in 1900 at the second modern Olympic Games.

Winter Games Begun

The winter Olympic Games were first held in 1924, beginning a tradition of holding them a few months earlier and in a different city than the summer Olympic Games. Beginning in 1994, the winter Olympic Games were held in completely different years (two years apart) than the summer Games.
Cancelled Games

Because of World War I and World War II, there were no Olympic Games in 1916, 1940, or 1944.

Tennis Banned

Tennis was played at the Olympics until 1924, then reinstituted in 1988.
Walt Disney

In 1960, the Winter Olympic Games were held in Squaw Valley, California (United States). In order to dazzle and impress the spectators, Walt Disney headed the committee that organized the opening day ceremonies. The 1960 Winter Games Opening Ceremony featured high school choirs and bands, the release of thousands of balloons, fireworks, ice statues, the release of 2,000 white doves, and national flags dropped by parachute.
Russia Not Present

Though Russia had sent a few athletes to compete in the 1908 and 1912 Olympic Games, they did not compete again until the 1952 Games.

Motor Boating

Motor boating was an official sport at the 1908 Olympics.

Polo, an Olympic Sport

Polo was played at the Olympics in 1900, 1908, 1920, 1924, and 1936.

Gymnasium

The word "gymnasium" comes from the Greek root "gymnos" meaning nude; the literal meaning of "gymnasium" is "school for naked exercise." Athletes in the ancient Olympic Games would participate in the nude.

Stadium

The first recorded ancient Olympic Games were held in 776 BCE with only one event - the stade. The stade was a unit of measurement (about 600 feet) that also became the name of the footrace because it was the distance run. Since the track for the stade (race) was a stade (length), the location of the race became the stadium.
Counting Olympiads

An Olympiad is a period of four successive years. The Olympic Games celebrate each Olympiad. For the modern Olympic Games, the first Olympiad celebration was in 1896. Every four years celebrates another Olympiad; thus, even the Games that were cancelled (1916, 1940, and 1944) count as Olympiads. The 2004 Olympic Games in Athens was called the Games of the XXVIII Olympiad.

Friday, April 17, 2009

Make a Battery from Potato

Introduction:
Batteries generate electricity through a chemical reaction between two different electrodes and one electrolyte. Copper and zinc electrodes with sulfuric acid as the electrolyte are a proven combination. We wondered whether any other liquid could be used as the electrolyte, which gave us the idea of using a potato: after all, a fresh potato has a lot of juice that may serve our purpose as an electrolyte.
Problem:
Can a potato be used to generate electricity?
Hypothesis:
Potato juice contains many water-soluble chemicals that may cause a chemical reaction with one or both of our electrodes, so we may get some electricity from that.

Material:
For this experiment we use:
A fresh potato
Copper Electrode
Zinc Electrode
A Digital or Analog Multimeter to measure Voltage or Current of produced electricity.
Alligator clips/ Leads

Procedure:
We insert the copper and zinc electrodes into the potato, close to but not touching each other. We use clip leads to connect the electrodes to the multimeter to measure the voltage between the two electrodes or the current passing through the multimeter. For this experiment we removed the zinc shell of a broken AA battery to use as our zinc electrode. (Make sure to test your multimeter first by connecting its positive and negative leads to each other, which should read zero volts and zero current.)


Results:
A digital multimeter showed 1.2 volts between the electrodes, but the analog multimeter showed a much smaller value. In other words, even though the voltage between the electrodes is 1.2 volts, the cell cannot supply enough current for an analog multimeter to show the full voltage. (An analog multimeter draws its power from the potato itself, while a digital multimeter is powered by an internal battery and draws almost no current from the potato; that is why the digital meter shows a larger, more accurate value.) We repeated this experiment with several other fruits, with almost the same results: in every case the voltage produced was between 1 and 1.5 volts, and in no case was the current enough to light a small bulb.
Another thing we learned from this experiment is that creating electricity and making a battery is easy; the real challenge is making a battery that can deliver a larger current for a longer time.
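The roughly 1.2-volt reading is close to what textbook chemistry predicts for a zinc/copper cell; a minimal sketch of that calculation (the potential values are standard-table numbers under ideal conditions, not measurements from this experiment):

```python
# Theoretical cell voltage from standard reduction potentials (25 C, ideal
# conditions -- a real potato cell deviates from these assumptions).
STANDARD_POTENTIALS = {       # volts vs. the standard hydrogen electrode
    "Cu2+/Cu": +0.34,
    "Zn2+/Zn": -0.76,
}

# Cell voltage = cathode potential - anode potential.
cell_voltage = STANDARD_POTENTIALS["Cu2+/Cu"] - STANDARD_POTENTIALS["Zn2+/Zn"]
print(f"ideal Zn/Cu cell: {cell_voltage:.2f} V")  # ~1.10 V
```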

How do animals spend the winter?



The weather gets colder, days get shorter and leaves turn color and fall off the trees. Soon, winter is here. Snow covers the ground. People live in warm houses and wear heavy coats outside. Our food comes from the grocery store. But what happens to the animals?
MIGRATE

Animals do many different, amazing things to get through the winter. Some of them "migrate." This means they travel to other places where the weather is warmer or they can find food.
Many birds migrate in the fall. Because the trip can be dangerous, some travel in large flocks. For example, geese fly in noisy, "V"-shaped groups. Other kinds of birds fly alone.
How do they know when it is time to leave for the winter? Scientists are still studying this. Many see migration as part of a yearly cycle of changes a bird goes through. The cycle is controlled by changes in the amount of daylight and the weather.

Birds can fly very long distances. For example, the Arctic tern nests close to the North Pole in the summer. In autumn, it flies south all the way to Antarctica. Each spring it returns north again.
Most birds migrate shorter distances. But how do they find their way to the same place each year? Birds seem to navigate like sailors once did, using the sun, moon and stars for direction. They also seem to have a compass in their brain for using the Earth's magnetic field.
Other animals migrate, too. A few mammals, like some bats, caribou, elk, and whales, travel in search of food each winter. Many fish migrate. They may swim south, or move into deeper, warmer water.

Insects also migrate. Some butterflies and moths fly very long distances. For example, Monarch butterflies spend the summer in Canada and the Northern U.S. They migrate as far south as Mexico for the winter. Most migrating insects go much shorter distances. Many, like termites and Japanese beetles, move downward into the soil. Earthworms also move down, some as far as six feet below the surface.

ADAPT

Some animals remain and stay active in the winter. They must adapt to the changing weather. Many make changes in their behavior or bodies. To keep warm, animals may grow new, thicker fur in the fall. On weasels and snowshoe rabbits, the new fur is white to help them hide in the snow.
Food is hard to find in the winter. Some animals, like squirrels, mice and beavers, gather extra food in the fall and store it to eat later. Some, like rabbits and deer, spend winter looking for moss, twigs, bark and leaves to eat. Other animals eat different kinds of food as the seasons change. The red fox eats fruit and insects in the spring, summer and fall. In the winter, it can not find these things, so instead it eats small rodents.

Animals may find winter shelter in holes in trees or logs, under rocks or leaves, or underground. Some mice even build tunnels through the snow. To try to stay warm, animals like squirrels and mice may huddle close together.

Certain spiders and insects may stay active if they live in frost-free areas and can find food to eat. There are a few insects, like the winter stone fly, crane fly, and snow fleas, that are normally active in winter. Also, some fish stay active in cold water during the winter.

HIBERNATE

Some animals "hibernate" for part or all of the winter. This is a special, very deep sleep. The animal's body temperature drops, and its heartbeat and breathing slow down. It uses very little energy. In the fall, these animals get ready for winter by eating extra food and storing it as body fat. They use this fat for energy while hibernating. Some also store food like nuts or acorns to eat later in the winter. Bears, skunks, chipmunks, and some bats hibernate.

AND MORE
Cold-blooded animals like fish, frogs, snakes and turtles have no way to keep warm during the winter. Snakes and many other reptiles find shelter in holes or burrows, and spend the winter inactive, or dormant. This is similar to hibernation.
Water makes a good shelter for many animals. When the weather gets cold, they move to the bottom of lakes and ponds. There, frogs, turtles and many fish hide under rocks, logs or fallen leaves. They may even bury themselves in the mud. They become dormant. Cold water holds more oxygen than warm water, and the frogs and turtles can breathe by absorbing it through their skin.

Insects look for winter shelter in holes in the ground, under the bark of trees, deep inside rotting logs or in any small crack they can find. One of the most interesting places is in a gall. A gall is a swelling on a plant. It is caused by certain insects, fungi or bacteria. They make a chemical that affects the plant's growth in a small area, forming a lump. The gall becomes its maker's home and food source.

Every type of insect has its own life cycle, which is the way it grows and changes. Different insects spend the winter in different stages of their lives. Many insects spend the winter dormant, or in "diapause." Diapause is like hibernation. It is a time when growth and development stop. The insect's heartbeat, breathing and temperature drop. Some insects spend the winter as worm-like larvae. Others spend the winter as pupae. (This is a time when insects change from one form to another.) Other insects die after laying eggs in the fall. The eggs hatch into new insects in the spring and everything begins all over again.

Why is the sky blue?

On a clear sunny day, the sky above us looks bright blue. In the evening, the sunset puts on a brilliant show of reds, pinks and oranges. Why is the sky blue? What makes the sunset red?
To answer these questions, we must learn about light, and the Earth's atmosphere.
THE ATMOSPHERE
The atmosphere is the mixture of gas molecules and other materials surrounding the earth. It is made mostly of the gases nitrogen (78%), and oxygen (21%). Argon gas and water (in the form of vapor, droplets and ice crystals) are the next most common things. There are also small amounts of other gases, plus many small solid particles, like dust, soot and ashes, pollen, and salt from the oceans.
The composition of the atmosphere varies, depending on your location, the weather, and many other things. There may be more water in the air after a rainstorm, or near the ocean. Volcanoes can put large amounts of dust particles high into the atmosphere. Pollution can add different gases or dust and soot.
The atmosphere is densest (thickest) at the bottom, near the Earth. It gradually thins out as you go higher and higher up. There is no sharp break between the atmosphere and space.
LIGHT WAVES

Light is a kind of energy that radiates, or travels, in waves. Many different kinds of energy travel in waves. For example, sound is a wave of vibrating air. Light is a wave of vibrating electric and magnetic fields. It is one small part of a larger range of vibrating electromagnetic fields. This range is called the electromagnetic spectrum.
Electromagnetic waves travel through space at 299,792 km/sec (186,282 miles/sec). This is called the speed of light.

The energy of the radiation depends on its wavelength and frequency. Wavelength is the distance between the tops (crests) of the waves. Frequency is the number of waves that pass by each second. The longer the wavelength of the light, the lower the frequency, and the less energy it contains.
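The wavelength-frequency tradeoff can be checked numerically with c = λf; a small sketch (the wavelengths are representative values I chose, not figures from the text):

```python
# Wavelength-frequency relation for visible light: c = lambda * f,
# so a shorter wavelength means a higher frequency.
C = 2.998e8   # speed of light, m/s

wavelengths_nm = {"red": 700, "violet": 400}  # representative values
freqs_hz = {name: C / (nm * 1e-9) for name, nm in wavelengths_nm.items()}

for name, f in freqs_hz.items():
    print(f"{name}: {f:.2e} Hz")
```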
COLORS OF LIGHT

Visible light is the part of the electromagnetic spectrum that our eyes can see. Light from the sun or a light bulb may look white, but it is actually a combination of many colors. We can see the different colors of the spectrum by splitting the light with a prism. The spectrum is also visible when you see a rainbow in the sky.

The colors blend continuously into one another. At one end of the spectrum are the reds and oranges. These gradually shade into yellow, green, blue, indigo and violet. The colors have different wavelengths, frequencies, and energies. Violet has the shortest wavelength in the visible spectrum. That means it has the highest frequency and energy. Red has the longest wavelength, and lowest frequency and energy.
LIGHT IN THE AIR
Light travels through space in a straight line as long as nothing disturbs it. As light moves through the atmosphere, it continues to go straight until it bumps into a bit of dust or a gas molecule. Then what happens to the light depends on its wavelength and the size of the thing it hits.
Dust particles and water droplets are much larger than the wavelength of visible light. When light hits these large particles, it gets reflected, or bounced off, in different directions. The different colors of light are all reflected by the particle in the same way. The reflected light appears white because it still contains all of the same colors.
Gas molecules are smaller than the wavelength of visible light. If light bumps into them, it acts differently. When light hits a gas molecule, some of it may get absorbed. After a while, the molecule radiates (releases, or gives off) the light in a different direction. The color that is radiated is the same color that was absorbed. The different colors of light are affected differently. All of the colors can be absorbed. But the higher frequencies (blues) are absorbed more often than the lower frequencies (reds). This process is called Rayleigh scattering. (It is named after the English physicist Lord Rayleigh, who first described it in the 1870s.)
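In Rayleigh's result, the scattering intensity scales as 1/λ⁴ (a standard result, not stated above), which is why the difference between blue and red is so dramatic; a quick sketch with representative wavelengths of my choosing:

```python
# Rayleigh scattering intensity scales as 1/lambda^4, so shorter (bluer)
# wavelengths scatter far more strongly than longer (redder) ones.
blue_nm, red_nm = 450, 700    # representative wavelengths (assumed values)

# Ratio of scattering strength, blue relative to red.
ratio = (red_nm / blue_nm) ** 4
print(f"blue light scatters ~{ratio:.1f}x more than red")  # ~5.9x
```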
WHY IS THE SKY BLUE?
The blue color of the sky is due to Rayleigh scattering. As light moves through the atmosphere, most of the longer wavelengths pass straight through. Little of the red, orange and yellow light is affected by the air.
However, much of the shorter wavelength light is absorbed by the gas molecules. The absorbed blue light is then radiated in different directions. It gets scattered all around the sky. Whichever direction you look, some of this scattered blue light reaches you. Since you see the blue light from everywhere overhead, the sky looks blue.

As you look closer to the horizon, the sky appears much paler in color. To reach you, the scattered blue light must pass through more air. Some of it gets scattered away again in other directions. Less blue light reaches your eyes. The color of the sky near the horizon appears paler or white.

THE BLACK SKY AND WHITE SUN
On Earth, the sun appears yellow. If you were out in space, or on the moon, the sun would look white. In space, there is no atmosphere to scatter the sun's light. On Earth, some of the shorter wavelength light (the blues and violets) is removed from the direct rays of the sun by scattering. The remaining colors together appear yellow.
Also, out in space, the sky looks dark and black, instead of blue. This is because there is no atmosphere. There is no scattered light to reach your eyes.

WHY IS THE SUNSET RED?
As the sun begins to set, the light must travel farther through the atmosphere before it gets to you. More of the light is reflected and scattered. As less reaches you directly, the sun appears less bright. The color of the sun itself appears to change, first to orange and then to red. This is because even more of the short wavelength blues and greens are now scattered. Only the longer wavelengths are left in the direct beam that reaches your eyes.


The sky around the setting sun may take on many colors. The most spectacular shows occur when the air contains many small particles of dust or water. These particles reflect light in all directions. Then, as some of the light heads towards you, different amounts of the shorter wavelength colors are scattered out. You see the longer wavelengths, and the sky appears red, pink or orange.
Joel on Software
Advice for Computer Science College Students
by Joel Spolsky
Sunday, January 02, 2005

Despite the fact that it was only a year or two ago that I was blubbering about how rich Windows GUI clients were the wave of the future, college students nonetheless do occasionally email me asking for career advice, and since it's recruiting season, I thought I'd write up my standard advice which they can read, laugh at, and ignore.
Most college students, fortunately, are brash enough never to bother asking their elders for advice, which, in the field of computer science, is a good thing, because their elders are apt to say goofy, antediluvian things like "the demand for keypunch operators will exceed 100,000,000 by the year 2010" and "lisp careers are really very hot right now."
I, too, have no idea what I'm talking about when I give advice to college students. I'm so hopelessly out of date that I can't really figure out AIM and still use (horrors!) this quaint old thing called "email" which was popular in the days when music came on flat round plates called "CDs."
So you'd be better off ignoring what I'm saying here and instead building some kind of online software thing that lets other students find people to go out on dates with.
Nevertheless.
If you enjoy programming computers, count your blessings: you are in a very fortunate minority of people who can make a great living doing work they enjoy. Most people aren't so lucky. The very idea that you can "love your job" is a modern concept. Work is supposed to be something unpleasant you do to get money to do the things you actually like doing, when you're 65 and can finally retire, if you can afford it, and if you're not too old and infirm to do those things, and if those things don't require reliable knees, good eyes, and the ability to walk twenty feet without being out of breath, etc.
What was I talking about? Oh yeah. Advice.
Without further ado, then, here are Joel's Seven Pieces of Free Advice for Computer Science College Students (worth what you paid for them):
1. Learn how to write before graduating.
2. Learn C before graduating.
3. Learn microeconomics before graduating.
4. Don't blow off non-CS classes just because they're boring.
5. Take programming-intensive courses.
6. Stop worrying about all the jobs going to India.
7. No matter what you do, get a good summer internship.

Now for the explanations, unless you're gullible enough to do all that stuff just because I tell you to, in which case add: 8. Seek professional help for that self-esteem thing.
Learn how to write before graduating.
Would Linux have succeeded if Linus Torvalds hadn't evangelized it? As brilliant a hacker as he is, it was Linus's ability to convey his ideas in written English via email and mailing lists that made Linux attract a worldwide brigade of volunteers.
Have you heard of the latest fad, Extreme Programming? Well, without getting into what I think about XP, the reason you've heard of it is because it is being promoted by people who are very gifted writers and speakers.
Even on the small scale, when you look at any programming organization, the programmers with the most power and influence are the ones who can write and speak in English clearly, convincingly, and comfortably. Also it helps to be tall, but you can't do anything about that.
The difference between a tolerable programmer and a great programmer is not how many programming languages they know, and it's not whether they prefer Python or Java. It's whether they can communicate their ideas. By persuading other people, they get leverage. By writing clear comments and technical specs, they let other programmers understand their code, which means other programmers can use and work with their code instead of rewriting it. Absent this, their code is worthless. By writing clear technical documentation for end users, they allow people to figure out what their code is supposed to do, which is the only way those users can see the value in their code. There's a lot of wonderful, useful code buried on sourceforge somewhere that nobody uses because it was created by programmers who don't write very well (or don't write at all), and so nobody knows what they've done and their brilliant code languishes.
I won't hire a programmer unless they can write, and write well, in English. If you can write, wherever you get hired, you'll soon find that you're getting asked to write the specifications and that means you're already leveraging your influence and getting noticed by management.
Most colleges designate certain classes as "writing intensive," meaning, you have to write an awful lot to pass them. Look for those classes and take them! Seek out classes in any field that have weekly or daily written assignments.
Start a journal or weblog. The more you write, the easier it will be, and the easier it is to write, the more you'll write, in a virtuous circle.
Learn C before graduating

Part two: C. Notice I didn't say C++. Although C is becoming increasingly rare, it is still the lingua franca of working programmers. It is the language they use to communicate with one another, and, more importantly, it is much closer to the machine than "modern" languages that you'll be taught in college like ML, Java, Python, whatever trendy junk they teach these days. You need to spend at least a semester getting close to the machine or you'll never be able to create efficient code in higher level languages. You'll never be able to work on compilers and operating systems, which are some of the best programming jobs around. You'll never be trusted to create architectures for large scale projects. I don't care how much you know about continuations and closures and exception handling: if you can't explain why while (*s++ = *t++); copies a string, or if that isn't the most natural thing in the world to you, well, you're programming based on superstition, as far as I'm concerned: a medical doctor who doesn't know basic anatomy, passing out prescriptions based on what the pharma sales babe said would work.
Learn microeconomics before graduating
Super quick review if you haven't taken any economics courses: econ is one of those fields that starts off with a bang, with many useful theories and facts that make sense, can be proven in the field, etc., and then it's all downhill from there. The useful bang at the beginning is microeconomics, which is the foundation for literally every theory in business that matters. After that things start to deteriorate: you get into Macroeconomics (feel free to skip this if you want) with its interesting theories about things like the relationship of interest rates to unemployment which, er, seem to be disproven more often than they are proven, and after that it just gets worse and worse and a lot of econ majors switch out to Physics, which gets them better Wall Street jobs, anyway. But make sure you take Microeconomics, because you have to know about supply and demand, you have to know about competitive advantage, and you have to understand NPVs and discounting and marginal utility before you'll have any idea why business works the way it does.
Why should CS majors learn econ? Because a programmer who understands the fundamentals of business is going to be a more valuable programmer, to a business, than a programmer who doesn't. That's all there is to it. I can't tell you how many times I've been frustrated by programmers with crazy ideas that make sense in code but don't make sense in capitalism. If you understand this stuff, you're a more valuable programmer, and you'll get rewarded for it, for reasons which you'll also learn in micro.
Don't blow off non-CS classes just because they're boring.
Blowing off your non-CS courses is a great way to get a lower GPA.
Never underestimate how big a deal your GPA is. Lots and lots of recruiters and hiring managers, myself included, go straight to the GPA when they scan a resume, and we're not going to apologize for it. Why? Because the GPA, more than any other one number, reflects the sum of what dozens of professors over a long period of time in many different situations think about your work. SAT scores? Ha! That's one test over a few hours. The GPA reflects hundreds of papers and midterms and classroom participations over four years. Yeah, it's got its problems. There has been grade inflation over the years. Nothing about your GPA says whether you got that GPA taking easy classes in home economics at Podunk Community College or taking graduate level Quantum Mechanics at Caltech. Eventually, after I screen out all the 2.5 GPAs from Podunk Community, I'm going to ask for transcripts and recommendations. And then I'm going to look for consistently high grades, not just high grades in computer science.
Why should I, as an employer looking for software developers, care about what grade you got in European History? After all, history is boring. Oh, so, you're saying I should hire you because you don't work very hard when the work is boring? Well, there's boring stuff in programming, too. Every job has its boring moments. And I don't want to hire people that only want to do the fun stuff.
I took this course in college called Cultural Anthropology because I figured, what the heck, I need to learn something about anthropology, and this looked like an interesting survey course.
Interesting? Not even close! I had to read these incredibly monotonous books about Indians in the Brazilian rain forest and Trobriand Islanders, who, with all due respect, are not very interesting to me. At some point, the class was so incredibly wearisome that I longed for something more exciting, like watching grass grow. I had completely lost interest in the subject matter. Completely, and thoroughly. My eyes teared, I was so tired of the endless discussions of piling up yams. I don't know why the Trobriand Islanders spend so much time piling up yams, I can't remember any more, it's incredibly boring, but It Was Going To Be On The Midterm, so I plowed through it. I eventually decided that Cultural Anthropology was going to be my Boredom Gauntlet: my personal obstacle course of tedium. If I could get an A in a class where the tests required me to learn all about potlatch blankets, I could handle anything, no matter how boring. The next time I accidentally get stuck in Lincoln Center sitting through all 18 hours of Wagner’s Ring Cycle, I could thank my studies of the Kwakiutl for making it seem pleasant by comparison.
I got an A. And if I could do it, you can do it.
Take programming-intensive courses.
I remember the exact moment I vowed never to go to graduate school.
It was in a course on Dynamic Logic, taught by the dynamic Lenore Zuck at Yale, one of the brightest of an array of very bright CS faculty.
Now, my murky recollections are not going to do proper credit to this field, but let me muddle through anyway. The idea of Formal Logic is that you prove things are true because other things are true. For example thanks to Formal Logic, "Everyone who gets good grades will get hired" plus "Johnny got good grades" allows you to discover the new true fact, "Johnny will get hired." It's all very quaint and it only takes ten seconds for a deconstructionist to totally tear apart everything useful in Formal Logic so you're left with something fun, but useless.
Now, dynamic logic is the same thing, with the addition of time. For example, "after you turn the light on, you can see your shoes" plus "The light went on in the past" implies "you can see your shoes."
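The two toy inferences can be written as rules. The second uses the standard dynamic-logic box operator, where $[a]\varphi$ is read "after performing action $a$, $\varphi$ holds"; the propositions are just the examples from the text:

```latex
% Formal logic: plain modus ponens.
%   G = "Johnny got good grades",  H = "Johnny will get hired"
\frac{G \to H \qquad G}{H}

% Dynamic logic: the same rule with an action in the middle.
%   [flip]\,On = "after flipping the switch, the light is on"
\frac{[\mathit{flip}]\,\mathit{On} \qquad \mathit{flip}\text{ was performed}}{\mathit{On}}
```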
Dynamic Logic is appealing to brilliant theoreticians like Professor Zuck because it holds up the hope that you might be able to formally prove things about computer programs, which could be very useful, if, for example, you could formally prove that the Mars Rover's flash card wouldn't overflow and cause itself to be rebooted again and again all day long when it's supposed to be driving around the red planet looking for Marvin the Martian.
So in the first day of that class, Dr. Zuck filled up two entire whiteboards and quite a lot of the wall next to the whiteboards proving that if you have a light switch, and the light was off, and you flip the switch, the light will then be on.
The proof was insanely complicated, and very error-prone. It was harder to prove that the proof was correct than to convince yourself of the fact that switching a light switch turns on the light. Indeed the multiple whiteboards of proof included many skipped steps, skipped because they were too tedious to go into formally. Many steps were reached using the long-cherished method of Proof by Induction, others by Proof by Reductio ad Absurdum, and still others using Proof by Graduate Student.
For our homework, we had to prove the converse: if the light was off, and it's on now, prove that you flipped it.
I tried, I really did.
I spent hours in the library trying.
After a couple of hours I found a mistake in Dr. Zuck's original proof which I was trying to emulate. Probably I copied it down wrong, but it made me realize something: if it takes three hours of filling up blackboards to prove something trivial, allowing hundreds of opportunities for mistakes to slip in, this mechanism would never be able to prove things that are interesting.
Not that that matters to dynamic logicians: they're not in it for useful, they're in it for tenure.
I dropped the class and vowed never to go to graduate school in Computer Science.
The moral of the story is that computer science is not the same as software development. If you're really really lucky, your school might have a decent software development curriculum, although, they might not, because elite schools think that teaching practical skills is better left to the technical-vocational institutes and the prison rehabilitation programs. You can learn mere programming anywhere. We are Yale University, and we Mold Future World Leaders. You think your $160,000 tuition entitles you to learn about while loops? What do you think this is, some fly-by-night Java seminar at the Airport Marriott? Pshaw.
The trouble is, we don't really have professional schools in software development, so if you want to be a programmer, you probably majored in Computer Science. Which is a fine subject to major in, but it's a different subject than software development.
If you're lucky, though, you can find lots of programming-intensive courses in the CS department, just like you can find lots of courses in the History department where you'll write enough to learn how to write. And those are the best classes to take. If you love programming, don't feel bad if you don't understand the point of those courses in lambda calculus or linear algebra where you never touch a computer. Look for the 400-level courses with Practicum in the name. This is an attempt to hide a useful (shudder) course from the Liberal Artsy Fartsy Administration by dolling it up with a Latin name.
Stop worrying about all the jobs going to India.
Well, OK, first of all, if you're already in India, you never really had to worry about this, so don't even start worrying about all the jobs going to India. They're wonderful jobs, enjoy them in good health.
But I keep hearing that enrollment in CS departments is dropping perilously, and one reason I hear for it is "students are afraid to go into a field where all the jobs are going to India." That's so wrong for so many reasons. First, trying to choose a career based on a current business fad is foolish. Second, programming is incredibly good training for all kinds of fabulously interesting jobs, such as business process engineering, even if every single programming job does go to India and China. Third, and trust me on this, there's still an incredible shortage of the really good programmers, here and in India. Yes, there are a bunch of out of work IT people making a lot of noise about how long they've been out of work, but you know what? At the risk of pissing them off, really good programmers do have jobs. Fourth, you got any better ideas? What are you going to do, major in History? Then you'll have no choice but to go to law school. And there's one thing I do know: 99% of working lawyers hate their jobs, hate every waking minute of it, and they're working 90 hour weeks, too. Like I said: if you love to program computers, count your blessings: you are in a very fortunate minority of people who can make a great living doing work they love.
Anyway, I don't think students really think about this. The drop in CS enrollment is merely a resumption of historically normal levels after a big bubble in enrollment caused by dotcom mania. That bubble consisted of people who didn't really like programming but thought the sexy high paid jobs and the chances to IPO at age 24 were to be found in the CS department. Those people, thankfully, are long gone.
No matter what you do, get a good summer internship.
Smart recruiters know that the people who love programming wrote a database for their dentist in 8th grade, and taught at computer camp for three summers before college, and built the content management system for the campus newspaper, and had summer internships at software companies. That's what they're looking for on your resume.
If you enjoy programming, the biggest mistake you can make is to take any kind of job--summer, part time, or otherwise--that is not a programming job. I know, every other 19-year-old wants to work in the mall folding shirts, but you have a skill that is incredibly valuable even when you're 19, and it's foolish to waste it folding shirts. By the time you graduate, you really should have a resume that lists a whole bunch of programming jobs. The A&F graduates are going to be working at Enterprise Rent-a-Car "helping people with their rental needs." (Except for Tom Welling. He plays Superman on TV.)
To make your life really easy, and to underscore just how completely self-serving this whole essay is, my company, Fog Creek Software, has summer internships in software development that look great on resumes. "You will most likely learn more about software coding, development, and business with Fog Creek Software than any other internship out there," says Ben, one of the interns from last summer, and not entirely because I sent a goon out to his dorm room to get him to say that. The application deadline is February 1st. Get on it.
If you follow my advice, you, too, may end up selling stock in Microsoft way too soon, turning down jobs at Google because you want your own office with a door, and other stupid life decisions, but they won't be my fault. I told you not to listen to me.
Work with me, here! Fog Creek Software has great paid internships in software development for qualified college students. They’re in New York City. Free housing, lunch, and more. And you get to work on real, shipping software with the smartest developers in the business.

Wednesday, April 15, 2009

Electricity's Spark of Life


Lots of kids get scared when their bedroom lights go out at night. When an entire city goes dark, many more people start to worry.
Government and utility officials are still scrambling to explain a blackout that hit much of the northeastern United States in late summer. From Detroit to New York, lights went out. Refrigerators, traffic signals, elevators, and subway trains stopped working. Computers went dead.
Without electricity, people had trouble getting to work, shopping for groceries, and communicating with each other. Normal life pretty much shut down for a few days.
Electricity also plays a crucial role within the human body. A lightning bolt or shock can disrupt or shut down that flow, causing disability or death.
"Electricity is life," says David Rhees, executive director of the Bakken Library and Museum in Minneapolis. The Bakken museum is dedicated entirely to the history and applications of electricity and magnetism in biology and medicine.
The museum has a lot to keep up with. As scientists learn more about the electrical signals that whiz through our bodies and the electrical pulses that tell our hearts to beat, they are finding new ways to use electricity to save lives.
Research on the nervous systems of animals and people is helping scientists design machines that help diagnose and treat brain conditions and other problems. New drugs are being developed to regulate the body's electrical pulses when things go wrong in response to injury or disease.
Electricity everywhere
Electricity is everywhere, thanks to the unique structure of the universe. Matter, which is basically everything you see and touch, is made up of tiny units called atoms. Atoms themselves are made up of even tinier parts called protons and neutrons, which form the atom's core, and electrons, which move around outside the core.
Protons have a positive electrical charge, and electrons have a negative electrical charge. Normally, an atom has an equal number of electrons and protons. The positive and negative charges cancel each other out, so the atom is neutral.
When an atom gains an extra electron, it becomes negatively charged. When an atom loses an electron, it becomes positively charged. When the conditions are right, such charge imbalances can generate a current of electrons. This flow of electrons (or electrically charged particles) is what we call electricity.
The first person to discover that electricity plays a role in animals was Luigi Galvani, who lived in Italy in the late 18th century. He found that electricity can cause a dissected frog's leg to twitch, showing a connection between electrical currents traveling along an animal's nerve and the action of muscles.
Quick signals
All animals that move have electricity in their bodies, says Rodolfo Llinas, a neuroscientist at New York University's School of Medicine. Everything we see, hear, and touch gets translated into electrical signals that travel between the brain and the body via special nerve cells called neurons.
Electricity is the only thing that's fast enough to carry the messages that make us who we are, Llinas says. "Our thoughts, our ability to move, see, dream, all of that is fundamentally driven and organized by electrical pulses," he says. "It's almost like what happens in a computer but far more beautiful and complicated."
By attaching wires to the outside of the body, doctors can monitor the electrical activity inside. One special machine records the heart's electrical activity to produce an electrocardiogram (EKG)—strings of squiggles that show what the heart is doing. Another machine produces a pattern of squiggles (called an EEG) that represents the electrical activity of neurons in the brain.
One of the newest technologies, called MEG, goes even further. It actually produces maps of magnetic fields caused by electrical activity in the brain, instead of just squiggles.
Recent observations of patterns of nerve-cell action have given scientists a much better view of how electricity works in the body, Llinas says. "The difference between now and 20 years ago is not even astronomical," he says. "It's galactic."
Now, researchers are looking for new ways to use electricity to help people with spinal injuries or disorders of the nervous system, such as Parkinson's disease, Alzheimer's disease, or epilepsy.
People with Parkinson's disease, for example, often end up having tremors and being unable to move. One type of treatment involves drugs that change the way nerve cells communicate with each other. As part of another new treatment, doctors put tiny wires on the head that send electrical impulses into the patient's brain. "As soon as you put that in," Llinas says, "the person can move again."
Philip Kennedy at Emory University in Atlanta has even invented a kind of "thought control" to help severely paralyzed people communicate with the outside world. His invention, called a neurotrophic electrode, is a hollow glass cone filled with wires and chemicals. With an implanted electrode, a patient who can't move at all can still control the movement of a cursor across a computer screen.
Looking to the past
One way to help keep the medical field speeding into the future might be to cultivate an appreciation for the past. At least, that's what the folks at the Bakken museum think.
When I recently visited the museum, Rhees and Kathleen Klehr, the museum's public relations manager, took me down to a huge padlocked room in the basement called "The Vault." Row upon row of shelves were crammed with rare, old books about electricity, early versions of pacemakers and hearing aids, and all sorts of weird devices. One was a shoe-store X-ray machine, powered by electricity, that showed you whether your foot fit comfortably into a new shoe.
Upstairs, the exhibits included a tank of electric fish and Hopi dolls dedicated to the spirit of lightning.
There's also a whole room dedicated to a monster made famous in a book titled Frankenstein. Made from assorted human parts, the monster was brought to life by an electrical spark. When Mary Shelley wrote Frankenstein in 1818, electricity was still a relatively new idea, and people were fascinated by the possibilities of what they might be able to do with it.
Even today, the Frankenstein room, with its scary multimedia presentation, remains one of the Bakken's most popular exhibits, Klehr told me. "It's been centuries," she says, "and everyone is still excited about Frankenstein."
That's something you might keep in mind the next time a blackout strikes. Without electricity, those monsters under your bed might have a lot less power over you!

Einstein's Skateboard


Albert Einstein never rode a skateboard. Last month, however, skateboarders caught big air on a halfpipe in honor of the famous physicist.
In a field house on the campus of the University of Maryland in College Park, near Washington, D.C., a handful of skaters and a BMX biker took turns pulling tricks on the pipe. Off to the side, five middle-school students watched intently. With video cameras bearing down on them and a deadline approaching, the students worked frantically to solve the team's first problem of the day.

"Our challenge," said David Westrich, 14, of Cape Girardeau, Mo., "is to find the points along the halfpipe where skaters experience the most and least gravitational force."
"Skateboard Physics" was one of six activities that eight teams tackled as part of this year's Discovery Channel Young Scientist Challenge (DCYSC). Every year, the competition brings the nation's top 40 middle-school science-fair winners to Washington, D.C., to battle for thousands of dollars in scholarship money, dream science trips, video cameras, and other prizes.
Finalists are judged individually on their ability to work as part of a team and to solve problems in clever ways. Students also need to be good at explaining their ideas. Swarms of video cameras follow contestants everywhere they go. When the week is over, the Discovery Channel produces a TV special about the event.
Star of the show
This year at DCYSC, although the students received plenty of attention, Albert Einstein was the star of the show.
Nearly 100 years ago, in one "miraculous year," Einstein wrote three research articles that changed the face of physics forever. In one article, he showed how light could be viewed as waves of radiation or as a beam made up of particles—little bundles of energy. In another, he introduced relativity theory and later showed that energy can be converted into matter and matter into energy. Finally, he explained how tiny particles get bounced around in a liquid.
To mark the 100th anniversary of Einstein's amazing work in 1905, the year 2005 has been designated the "World Year of Physics." In 1999, Time magazine had named Einstein "Person of the Century."

So, it's not surprising that Einstein became the theme of this year's DCYSC competition. "Einstein is the man," said challenge designer Steve "Judge Jake" Jacobs. "He's a legend."
Einstein's scientific achievements involved thinking in new ways. He spent his time doing "thought experiments" rather than mixing things together in a lab.
"Einstein was able to focus his thinking so perfectly," Judge Jake said. "Here, the kids have to focus their thinking to achieve their goals."
Judge Jake's hope was that finalists would go home with a sense of appreciation for how brilliant Einstein really had been and a drive to be just as successful. "I hope that, just for a moment, they'll think the way Einstein thought," he said.
Radar guns
Four out of six challenges this year had direct connections to physics and Einstein.
In the "Skateboard Challenge," teams had to figure out that the gravitational force was greatest as the skaters accelerated downhill and that it was smallest when the skaters reached the top, just before starting down again. They used a computer program and data from a high-definition video camera and a device called an accelerometer to test their predictions.
One of the other physics-related activities was called "Radar Gun Luge." In it, students checked out the notion that the speed of an object, as measured by a moving observer, depends on the observer's speed.
Finalists used radar guns to measure the speed of two luge carts. One cart held a wild-haired Einstein doll. The other carried a remote-controlled radar gun.

Team members had to drag the carts to the top of two steep tracks, then release the carts at the same time. Altogether, they had 90 minutes to confirm that the doll's measured speed was twice as fast when measured from the other cart moving toward it as when it was measured by an observer standing at the side.
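The prediction the teams had to confirm is ordinary (Galilean) velocity addition: an observer moving toward an object sees the object's speed and their own add together. Since both carts are released at the same time on identical tracks, each rolls at the same speed $v$:

```latex
v_{\text{measured from moving cart}} \;=\; v_{\text{doll}} + v_{\text{observer}} \;=\; v + v \;=\; 2v
```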
If this sounds confusing to you, you're not alone. More than halfway through the activity, the red team was looking visibly shaken. "All of our data is useless," said Nicholas Ekladyous, 14, from Imlay City, Mich., with a hint of panic in his voice.
"We have no data," answered his teammate Rebecca Chan, 13, from Encinitas, Calif.

Discussion about what to do next rapidly turned sour. With just 10 minutes left, Nick was trying to persuade his teammates that they should abandon their original strategy and make a new plan. "Nick," Rebecca said, "it's going to take you longer to talk us out of this than for us to just do it."
After that temporary breakdown, it was on to the next challenge.

A New Basketball Gets Slick


Basketball players need more than strength, speed, and skills to be on top of their game. Technology, too, can make the difference between a slam dunk and a stolen ball.
Now, technology and basketball seem to have collided, and some players are calling foul. At the center of the debate is a new type of ball introduced over the summer by the National Basketball Association. The NBA season began last week.

Basketballs used in NBA games have long had a leather cover. The new balls, however, are covered with a special kind of plastic. Spalding, the company that makes the new balls, insists that thorough tests during development showed that the synthetic covering performs better than leather does.
Experiments by scientists in Texas, however, seem to show otherwise. The researchers suggest that the plastic balls are less bouncy, more likely to bounce off course, and more slippery when moistened with sweat. These early experimental results suggest that this change in ball design could have a big effect on the quality of game play.
To compare friction, or the ball's ability to stick to surfaces (such as hands), the scientists took measurements as they slid both old and new balls against sheets of silicone. Silicone is similar to the palms of our hands in its degree of stickiness.
When dry, the old leather balls slid more easily than did the new plastic balls. When moistened with just one drop of a sweat-like liquid, however, the plastic balls became a lot more slippery than when they were dry.
Leather balls actually became stickier with sweat. And they absorbed moisture about eight times more quickly than the plastic balls did.
"When the balls are dry, the synthetic ball is easier to grip, and when they're wet, the leather one is much easier to grip," says physicist James L. Horwitz of the University of Texas-Arlington.
To keep professional players from dropping the ball, it may be necessary to change and clean balls throughout a game.
Some scientists are urging the NBA to reconsider the switch until scientists finish further testing.
John J. Fontanella, a former college basketball player and now a physicist at the United States Naval Academy in Annapolis, Md., belongs to that group. "The NBA," he says, "should stick with the leather basketball for another year."

Powering Ball Lightning


Ball lightning is one of the strangest objects you might never see. The rare, basketball-sized fireballs occasionally form in nature after lightning strikes soil. They can float or bounce and last for several minutes before disappearing.
In recent years, scientists have learned something about the science behind ball lightning. But questions remain. A new study helps illuminate the picture.
Researchers at Tel Aviv University in Israel began the study after making ball lightning by mistake in their lab. Vladimir Dikhtyar and Eli Jerby had just invented a new type of drill that was made partly from pieces of microwave ovens.
The tip of the drill concentrates microwave radiation into a spot that measures just 2 millimeters wide. Such concentrated radiation allows the drill to pierce many materials.
About 10 years ago, Dikhtyar and Jerby were testing their new device when a glowing blob suddenly blew out of the material they were drilling. The blob eventually reentered the drill, causing a lot of damage.
Hoping to find out what had ruined their fancy tool, the engineers experimented until they could reliably make fireballs on purpose. The trick, they found, was to drill into glass.
They found a way to cage the glowing blobs for up to several minutes. To make the trap, they used a tissue-box-sized container with glass walls. They kept the glowing orbs alive by zapping them with extra microwaves.
The lab-made blobs were different from ball lightning that occurs in nature. For one thing, the artificial balls were much smaller—just a few centimeters across, instead of basketball-sized or bigger. They formed in a different way too. And if left alone, the manmade blobs vanished within 30 milliseconds. (There are 1,000 milliseconds in 1 second).
Still, the scientists thought their blobs were realistic enough to help test one of the leading theories about what causes ball lightning in nature.
In 2000, researchers from the University of Canterbury in Christchurch, New Zealand, proposed that ball lightning forms when lightning strikes soil. Under the right conditions, the strike creates a charged gas that glows and contains dust that is full of microscopic particles. Chemical reactions within the dust then create energy that keeps the gas glowing, the scientists suspected.
Using an intense X-ray beam, Dikhtyar and Jerby found evidence to support that theory. Their tests showed tiny particles within the artificial blobs. These particles were similar in size to the particles that may exist in natural ball lightning.

A new look at Saturn's rings


Many students know that to figure out the age of a tree, you count the number of rings that make up its trunk, one ring for each year. But what if you wanted to know the age of the rings that surround the planet Saturn?
It's a tricky question that scientists have tried to answer for decades. In the late 1970s, the National Aeronautics and Space Administration, or NASA, sent a pair of spacecraft called Voyager 1 and Voyager 2 into outer space. Part of their mission was to fly past Saturn while taking pictures of and collecting data about the planet, then send all this information back to Earth.
Based on the data collected on those missions, scientists first estimated that the rings surrounding Saturn were only 100 million years old. Even though that sounds very old, 100 million years is actually quite young when compared with the solar system, which is 4.6 billion years old.
Looking at the physical characteristics of the particles that make up the rings is partly what helped astronomers determine the age. They reasoned that because the rings appear shiny and reflective, the particles in them, and the rings themselves, were fairly young. The scientists thought that the particles were young because they had not been around long enough for their surfaces to become darkened and less reflective. Things like dust and craters left from collisions with small meteorites can get particles dirty.
But a team of researchers in Colorado thinks Saturn's rings might be much older, closer to the age of the solar system itself. These researchers used a combination of computer simulations, which mimic events, and data from the Cassini spacecraft, which is currently orbiting Saturn and collecting data.
In the computer simulation, the team estimated the gravitational pull, a force that pulls objects together, between each of the particles making up the rings. Big particles in the rings may pull smaller particles to themselves, where they stick and make one larger particle. In their simulations, the researchers found that the particles making up Saturn's rings stick together in clumps and are not uniformly distributed, as previously thought.
The formation of new, larger ring particles from older, smaller ones could erase any surface darkening from previous collisions with meteorites, the researchers reasoned. They suggest the particles may look younger than they really are because they constantly clump together, possibly burying the cratered, dusty surface of the older particles beneath the surface of the new clumped particles.
Because of these clumped particles, scientists may have also underestimated the mass of the rings. Previously, astronomers calculated the mass of the rings by measuring how much starlight their particles blocked. The thinking was that the amount of blocked starlight could tell the amount of material in the rings. The more starlight was blocked, the more mass was present in the rings, the scientists reasoned.
But the older calculation assumed the particles were fairly evenly spread out in the rings. These newer data suggest the particles in the rings are clumped together with large empty spaces between them. In that arrangement, more light passes through than if the same mass of particles was spread evenly, as previously thought. This new understanding suggests Saturn's rings contain much more mass than scientists first estimated.
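A toy calculation (my own sketch, not from the article) shows why clumping throws off the old mass estimate. If particles are spread evenly, every ray of starlight passes through the same amount of material and dims exponentially. If the same material is packed into clumps covering, say, half the ring area, with empty gaps in between, more total light gets through even though the mass is identical:

```python
import math

# Optical depth (tau) measures how much material a light ray passes through.
tau = 1.0  # illustrative value for an evenly spread ring

# Evenly spread particles: every ray sees the same optical depth.
uniform_transmission = math.exp(-tau)

# Same total mass squeezed into clumps over half the area (gaps are empty):
# half the rays pass freely, half hit twice the optical depth.
clumped_transmission = 0.5 * 1.0 + 0.5 * math.exp(-2 * tau)

print(uniform_transmission)  # about 0.37
print(clumped_transmission)  # about 0.57 -- more light slips through the gaps
```

Reading the extra transmitted light as "less material there" would understate the mass, which is why clumping suggests the rings are heavier than first thought.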
Taken together, the findings raise new questions about the estimated age of Saturn's rings, says Mark Lewis, a computer scientist at Trinity University in San Antonio, Texas. But until astronomers know more about what material the ring particles are made of, and details about how they clump together, the age of Saturn's rings will remain an astronomical puzzle.

Galaxies on the go


Scientists have a mystery of cosmic proportions on their hands. Recently, astronomers noticed something strange: millions of stars seem to be racing at high speed toward a single spot in the sky.
Huge collections of stars, gas and dust are called galaxies. Some galaxies congregate into groups of hundreds or thousands, called galaxy clusters. These clusters can be observed by the X-rays they give off.
Scientists are excited about the racing clusters because the cause of their movement can't be explained by any known means.
The discovery came about when scientists studied a group of 700 galaxy clusters. These clusters were carefully mapped in the early 1990s using data collected by an orbiting telescope. The telescope recorded X-rays created by electrons in the hot core of each galaxy cluster.
The researchers then looked at the same 700 clusters on a map of what’s called the cosmic microwave background, or CMB. The CMB is radiation, a form of energy, left over from the Big Bang. Scientists believe that the Big Bang marks the beginning of the universe, billions of years ago. The CMB provides a picture of how the early universe looked soon after the Big Bang.
By comparing information from the CMB to the map of galaxy clusters, scientists could measure the movement of the clusters. This is possible because a cluster’s movement causes a change in how bright the CMB appears.
As a galaxy cluster moves across the sky, the electrons from its hot core interact with radiation from the CMB. This interaction creates a change in the radiation’s frequency, or how often an event occurs in a certain amount of time. Scientists can then measure the frequencies to detect movement.
As a galaxy cluster moves toward Earth, the radiation frequency goes up. As a cluster moves away from Earth, the frequency goes down. This shift in the frequencies creates an effect similar to the Doppler effect.
The Doppler effect is commonly used to measure the speed of moving objects, such as cars. Scientists can use this method to measure the speed and direction of moving galaxies by looking at changes in the radiation frequencies.
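The size of the shift can be sketched with simple arithmetic. For speeds much slower than light, the fractional change in frequency is roughly the cluster's speed divided by the speed of light (this is the standard textbook Doppler approximation, not a calculation from the study itself):

```python
# Non-relativistic Doppler approximation: delta_f / f ~= v / c
c = 299_792.458  # speed of light, km/s
v = 1_000.0      # reported cluster speed, km/s

fractional_shift = v / c
print(fractional_shift)  # about 0.0033 -- a shift of roughly a third of a percent
```

A shift that small is why the researchers describe the frequency changes as tiny even though the clusters are moving at 1,000 kilometers per second.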
What the scientists found surprised them. Though the frequency shifts were small, the clusters were moving across the sky at a high speed — about 1,000 kilometers per second. Even more surprising, the clusters were all moving in the same direction toward a single point in the sky.
Researchers don’t know what’s pulling this matter across the sky, but they are calling the source “dark flow.”
Whatever it is, scientists say the source likely lies outside the visible universe. That means it can’t be detected by ordinary means, such as telescopes.
One thing is certain. Dark flow has shown that we don’t understand everything we see in the universe and that there are still discoveries to be made.

Earth from the inside out


Scientists have long known this strange fact: It’s easier to look deep into space than into the center of Earth. Light can pass through most of space, so the light from distant stars can easily be seen with the naked eye. But Earth is opaque, which means that light cannot pass through it.
Since light cannot pass through the planet, we cannot see what’s inside it. So if we can’t use light to look inside our own planet, what can we use?
Recently, some scientists have been trying to use neutrinos — tiny particles smaller than an atom that zip through space. Neutrinos come from the sun or other distant stars, and astronomers have studied them for years. Now, a team of geoscientists — “geo” means Earth — think a kind of neutrino may have something to say about the Earth, too.
Not all neutrinos come from outer space. Special neutrinos called geoneutrinos are generated from within the Earth. (Remember that “geo” means Earth.) Most of these local neutrinos come from either the crust or the mantle. The crust is Earth’s outermost shell, what we stand on, and the mantle lies 5 to 25 miles below the crust. Certain elements within the Earth can send off geoneutrinos when undergoing a process called radioactive decay.
During radioactive decay, a material loses some of its energy by sending out particles and radiation. An element that goes through this process is said to be radioactive, and radioactive elements occur naturally in the Earth. Some radioactive elements produce geoneutrinos.
After they are produced, geoneutrinos pass straight through the solid Earth without being absorbed or bouncing around. If they’re not stopped, they go straight into outer space — and keep going, and going and going. Geoscientists hope to catch a few of these particles on their way out, but it’s not going to be easy.
There are two big problems: There aren’t that many geoneutrinos, and they’re hard to find. To catch these elusive particles, scientists have designed special geoneutrino detectors. These strange-looking scientific instruments are giant, metal spheres buried deep underground.
In an abandoned mine in Canada, for example, scientists are preparing a geoneutrino detector that is four stories tall and more than a mile underground. The detector will be filled with a special liquid that flashes when a geoneutrino passes through. The liquid “produces a lot of light, and it’s very transparent,” says Mark Chen, the director of the project. When it’s up and running, probably in 2010, the detector will find only about 50 geoneutrinos per year. Other detectors are being planned all over Earth — one of them is even supposed to sit on the bottom of the ocean!
The geoscientists who study geoneutrinos hope that the particles will help answer an old question about the Earth. The interior of the Earth is blistering hot, but where does the heat come from? They know that part of the heat — maybe as much as 60 percent — comes from radioactive decay, but researchers want to know for sure. By measuring geoneutrinos, scientists hope to figure out how radioactive decay helps heat Earth.

Tuesday, April 7, 2009

Asteroid tracked from space to Earth

They saw it coming, and they got what was coming to them. For the first time, researchers not only detected an asteroid in space, but also tracked its progress and then collected its debris after it crashed to Earth.
The car-sized asteroid, dubbed 2008 TC3, landed in northern Sudan on October 7, 2008, scientists report in the March 25 Nature. The study combines, for one asteroid, data that are usually separate: comparing observations of the asteroid while it was in space with analysis of its meteorite fragments on Earth will yield new insights into asteroids, the scientists say.
Small asteroids like 2008 TC3 are fairly common, with about one asteroid impacting Earth each year. But these small asteroids are usually not spotted until they enter the Earth’s atmosphere. “It’s like when bugs splatter on the windshield. You don’t see the bug until it’s too late,” says physicist and study coauthor Mark Boslough of Sandia National Laboratories in Albuquerque, N.M. Bigger asteroids are easier to spot but are much less common. “You’d see a baseball coming towards the windshield much sooner,” Boslough says. And it’s hard to detect the small asteroids because even powerful telescopes can only scan a small portion of the sky each night.
Scientists got lucky when they spotted 2008 TC3 using the Catalina Sky Survey telescope atop Mount Lemmon north of Tucson, Ariz. “It just so happened that the asteroid was coming from the direction that the telescope was pointed in,” says astronomer and study coauthor Peter Jenniskens of the SETI Institute in Mountain View, Calif.
As 2008 TC3 hurtled through space, researchers studied the spectra of sunlight reflected from its surface to get information about the asteroid’s mineral composition. The spectra showed that the asteroid likely came from the mysterious F class of asteroids, a class observed in space but never before matched to a meteorite found on Earth.
Monitoring 2008 TC3’s progress, researchers correctly predicted that it would impact the Nubian Desert of northern Sudan about 19 hours after it was first spotted. Eyewitnesses reported seeing a fireball as the asteroid exploded over the desert.
Jenniskens and 45 students and staff from the University of Khartoum in Sudan searched for remnants along the asteroid’s projected path. The recovery team eventually found 47 meteorites from 2008 TC3.
When the researchers got the meteorites back to the lab, they were in for a surprise. “The recovered meteorites were unlike anything in our collections up to that point,” Jenniskens says.
Studying the ratio of oxygen isotopes revealed that the meteorites were of the rare ureilite category. “This is the first time that ureilites were linked to F-class asteroids,” comments astronomer David Nesvorny of the Southwest Research Institute in Boulder, Colo. Researchers had previously thought ureilite meteorites came only from S-class asteroids.
Following 2008 TC3 also gave the researchers the opportunity to test their asteroid tracking devices. If a dangerously large asteroid was on a collision course with Earth, scientists would want to know that everything worked, Boslough says.

Ice cubes in space

You’d need a mighty tall glass to hold two space objects that researchers have now identified as ice cubes at the fringes of the solar system. The larger of the icy bodies is about the width of Ohio, the smaller about twice the length of Rhode Island. Both bodies are moons of the dwarf planet Haumea. The trio, discovered in late 2004 and 2005, reside in the Kuiper Belt, a reservoir of objects beyond the orbit of Neptune whose most famous denizen is Pluto.
Spectra taken of the larger and outermost of the two moons, dubbed Hi’iaka, had indicated that its surface, unlike most Kuiper Belt objects, is made of nearly pure crystalline water-ice. Now, new spectra, taken with the Hubble Space Telescope, not only confirm the composition of Hi’iaka, but for the first time also show that the surface of the smaller moon, Namaka, has the same composition. Because both moons are too small to have undergone heating and cooling that would have caused heavier elements to sink to the cores, the icy surfaces are likely to be fair representations of the moons’ interiors.
“These things could be, essentially, ice cubes,” says Michael Brown of the California Institute of Technology in Pasadena, a codiscoverer of Haumea and its moons. Brown and Caltech colleague Wesley Fraser describe the new observations online and in the April 10 Astrophysical Journal Letters.
The frozen findings aren’t just a cosmic curiosity. Haumea, whose rapid spin is thought to have reshaped it into a squashed football, is glazed with water-ice. (The dwarf planet’s interior, in contrast, is made up of much denser material.) The similarity between the surface of Haumea and its moons strongly suggests that these satellites were not Kuiper Belt residents that happened to be captured by Haumea, but were chipped off the surface of the dwarf planet as a result of some cataclysmic event.
Indeed, Haumea is the only Kuiper Belt object known to have a collisional family — chunks created when a large impactor, perhaps 500 kilometers in diameter, struck the dwarf planet in the distant past.
In Hawaiian mythology, Hi’iaka and Namaka are both daughters of Haumea, the goddess of fertility, and the new findings provide fresh evidence that these moons are indeed offspring of the dwarf planet, Brown says.
“At face value, it looks like Haumea’s collisional family and the moons are one and the same — the product of some extraordinary event” early in the history of the solar system, comments Daniel Fabrycky of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass.
In a separate study, Brown and Caltech colleague Darin Ragozzine used both Hubble and the Keck Observatory atop Hawaii’s Mauna Kea to track the motions of the two moons relative to Haumea. This detailed look at the moons’ orbits reveals that, as seen from Earth, Namaka and Haumea began transiting, or passing in front of each other, two years ago. The researchers posted their findings online March 26, and the report will also appear in an upcoming Astronomical Journal.
Over the next few years, Namaka will journey across different sections of Haumea. The duration of each passage and the amount of light dimmed from Haumea will reveal the exact shape and size of the bodies, Ragozzine says.
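The geometry behind this can be sketched with a simple area ratio: when a small moon crosses in front of a larger body, the fraction of light blocked is roughly the moon's disk area divided by the larger body's disk area. The radii below are placeholder values chosen only to illustrate the idea, not measurements from the study:

```python
def transit_dimming(r_moon_km, r_body_km):
    """Fraction of light blocked when a moon's disk crosses a larger body's disk."""
    return (r_moon_km / r_body_km) ** 2

# Hypothetical radii, for illustration only
dimming = transit_dimming(100, 700)
print(dimming)  # about 0.02 -- a roughly 2 percent dip in brightness
```

By timing each passage and measuring how deep the dip is, researchers can work backward to the true sizes and shapes of the bodies.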
A particularly rare and intriguing event will happen this July 2, he adds, when Namaka passes in front of Hi’iaka. Observations of this passage could reveal a wealth of new information about both moons.

Sleep may clear the decks for next day’s learning

You snooze, you lose connections between brain cells, two new studies suggest.
People have known for some time that getting enough sleep is crucial for proper brain function. “If you don’t get enough sleep your ability to acquire, process and recall information is going to be impaired,” says Paul Shaw, a neuroscientist at Washington University in St. Louis and coauthor of one of the new studies.
But scientists debate exactly how sleep helps the brain learn and remember. Two studies appearing in the April 3 Science suggest that sleep weakens or severs connections between brain cells to make way for new information.

A study by Giorgio Gilestro, Giulio Tononi and Chiara Cirelli of the University of Wisconsin–Madison shows that proteins found in the connections between neurons, called synapses, build up in fruit fly brains while the flies are awake. Depriving flies of sleep leads to ever-greater levels of synaptic proteins, the researchers show. Levels of the proteins decrease as the flies sleep.
Scientists usually determine synapse strength by measuring electrical activity of neurons, but fruit fly brains are far too small for electrical measurements, Cirelli says. The proteins, she says, are markers of synaptic strength.
If true, the new finding would offer support for the theory of synaptic homeostasis, advanced by Tononi and Cirelli. The theory holds that sleep scales back the strength of connections between neurons, weakening the strongest connections and completely eliminating the weakest synapses. The cutbacks help save resources, the researchers say, and boost the signal of important memories over the noise of unneeded connections. “We assume that if this is happening, it is a major function, if not the most important function, of sleep,” Cirelli says.
Other researchers have gathered conflicting data, though, suggesting that sleep aids in strengthening synapses, not weakening them.
Cirelli and Tononi’s new study is the first demonstration that sleep “does something” to synaptic connections, says Marcos Frank, a sleep researcher at the University of Pennsylvania in Philadelphia. But the technique is an indirect measure of the strength of synapses and doesn’t conclusively prove weakening of the connections. Some of his data indicate that sleep strengthens certain connections in kittens’ brains.
Cirelli says her group’s study addresses the function of sleep only in adult brains and it is too early to say whether sleep does different things to brains in young animals and people. She adds that even if isolated synapses gain strength during sleep, overall the brain loses connections during a snooze.
Frank isn’t the only skeptic of the synaptic homeostasis theory. Shaw has been studying how sleep affects learning and memory and set up his experiment, in part, to prove the idea wrong, he says. “I wasn’t buying it at the time, but the data are telling me I ought to.”
Work in Shaw’s lab previously showed that fruit flies need extra sleep after social interactions. In the new study, the researchers tracked the increased sleep drive to 16 neurons known as large lateral ventral neurons. These neurons are part of the circuitry that forms the circadian clock in the fruit fly brain. The team genetically engineered the flies to produce green fluorescent protein in these neurons, making synapses easier to count. The study showed that the number of new synapses formed during social interactions decreased after flies slept. In contrast, sleep-deprived flies did not prune synapses, the researchers found. The data seem to support the theory, since only sleeping flies lose synapses.
At first glance, losing synapses doesn’t seem like a good way to learn, Shaw admits. He speculates that sleep’s downsizing of the number of synapses keeps the flies’ brain circuits from getting overwhelmed by excitement and sensory input in social situations.
The new study also links learning and the need to snooze to three genes — rutabaga, period and blistered. Period was previously identified as a key gear in the fruit fly circadian clock. Blistered is the fruit fly–equivalent of human serum response factor, a gene involved in learning, memory and general brain rewiring.
Shaw says he’s not a complete convert to the synaptic homeostasis theory, though. He thinks sleep may weaken synapses in some circuits while it strengthens connections in other circuits.