Appendix 16. Additional texts for writing abstracts, annotations and articles
1. The Management Myth
By Matthew Stewart
Most of management theory is inane, writes our correspondent, the founder of a consulting firm. If you want to succeed in business, don’t get an M.B.A. Study philosophy instead.
During the seven years that I worked as a management consultant, I spent a lot of time trying to look older than I was. I became pretty good at furrowing my brow and putting on somber expressions. Those who saw through my disguise assumed I made up for my youth with a fabulous education in management. They were wrong about that. I don’t have an M.B.A. I have a doctoral degree in philosophy—nineteenth-century German philosophy, to be precise. Before I took a job telling managers of large corporations things that they arguably should have known already, my work experience was limited to part-time gigs tutoring surly undergraduates in the ways of Hegel and Nietzsche and to a handful of summer jobs, mostly in the less appetizing ends of the fast-food industry.
The strange thing about my utter lack of education in management was that it didn’t seem to matter. As a principal and founding partner of a consulting firm that eventually grew to 600 employees, I interviewed, hired, and worked alongside hundreds of business-school graduates, and the impression I formed of the M.B.A. experience was that it involved taking two years out of your life and going deeply into debt, all for the sake of learning how to keep a straight face while using phrases like “out-of-the-box thinking,” “win-win situation,” and “core competencies.” When it came to picking teammates, I generally held out higher hopes for those individuals who had used their university years to learn about something other than business administration.
After I left the consulting business, in a reversal of the usual order of things, I decided to check out the management literature. Partly, I wanted to “process” my own experience and find out what I had missed in skipping business school. Partly, I had a lot of time on my hands. As I plowed through tomes on competitive strategy, business process re-engineering, and the like, not once did I catch myself thinking, Damn! If only I had known this sooner! Instead, I found myself thinking things I never thought I’d think, like, I’d rather be reading Heidegger! It was a disturbing experience. It thickened the mystery around the question that had nagged me from the start of my business career: Why does management education exist?
Management theory came to life in 1899 with a simple question: “How many tons of pig iron bars can a worker load onto a rail car in the course of a working day?” The man behind this question was Frederick Winslow Taylor, the author of The Principles of Scientific Management and, by most accounts, the founding father of the whole management business.
Taylor was forty-three years old and on contract with the Bethlehem Steel Company when the pig iron question hit him. Staring out over an industrial yard that covered several square miles of the Pennsylvania landscape, he watched as laborers loaded ninety-two-pound bars onto rail cars. There were 80,000 tons’ worth of iron bars, which were to be carted off as fast as possible to meet new demand sparked by the Spanish-American War. Taylor narrowed his eyes: there was waste there, he was certain. After hastily reviewing the books at company headquarters, he estimated that the men were currently loading iron at the rate of twelve and a half tons per man per day.
Taylor stormed down to the yard with his assistants (“college men,” he called them) and rounded up a group of top-notch lifters (“first-class men”), who in this case happened to be ten “large, powerful Hungarians.” He offered to double the workers’ wages in exchange for their participation in an experiment. The Hungarians, eager to impress their apparent benefactor, put on a spirited show. Huffing up and down the rail car ramps, they loaded sixteen and a half tons in something under fourteen minutes. Taylor did the math: over a ten-hour day, it worked out to seventy-five tons per day per man. Naturally, he had to allow time for bathroom breaks, lunch, and rest periods, so he adjusted the figure approximately 40 percent downward. Henceforth, each laborer in the yard was assigned to load forty-seven and a half tons of pig iron per day, with bonus pay for reaching the target and penalties for failing.
When the Hungarians realized that they were being asked to quadruple their previous daily workload, they howled and refused to work. So Taylor found a “high-priced man,” a lean Pennsylvania Dutchman whose intelligence he compared to that of an ox. Lured by the promise of a 60 percent increase in wages, from $1.15 to a whopping $1.85 a day, Taylor’s high-priced man loaded forty-five and three-quarters tons over the course of a grueling day—close enough, in Taylor’s mind, to count as the first victory for the methods of modern management.
Taylor went on to tackle the noble science of shoveling and a host of other topics of concern to his industrial clients. He declared that his new and unusual approach to solving business problems amounted to a “complete mental revolution.” Eventually, at the urging of his disciples, he called his method “scientific management.” Thus was born the idea that management is a science—a body of knowledge collected and nurtured by experts according to neutral, objective, and universal standards.
At the same moment was born the notion that management is a distinct function best handled by a distinct group of people—people characterized by a particular kind of education, way of speaking, and fashion sensibility. Taylor, who favored a manly kind of prose, expressed it best in passages like this:
… the science of handling pig iron is so great and amounts to so much that it is impossible for the man who is best suited to this type of work to understand the principles of this science, or even to work in accordance with these principles, without the aid of a man better educated than he is.
From a metaphysical perspective, one could say that Taylor was a “dualist”: there is brain, there is brawn, and the two, he believed, very rarely meet.

Taylor went around the country repeating his pig iron story and other tales from his days in the yard, and these narratives formed something like a set of scriptures for a new and highly motivated cult of management experts. This vanguard ultimately vaulted into the citadel of the Establishment with the creation of business schools. In the spring of 1908, Taylor met with several Harvard professors, and later that year Harvard opened the first graduate school in the country to offer a master’s degree in business. It based its first-year curriculum on Taylor’s scientific management. From 1909 to 1914, Taylor visited Cambridge every winter to deliver a series of lectures—inspirational discourses marred only by the habit he’d picked up on the shop floor of swearing at inappropriate moments.
Yet even as Taylor’s idea of management began to catch on, a number of flaws in his approach were evident. The first thing many observers noted about scientific management was that there was almost no science to it. The most significant variable in Taylor’s pig iron calculation was the 40 percent “adjustment” he made in extrapolating from a fourteen-minute sample to a full workday. Why time a bunch of Hungarians down to the second if you’re going to daub the results with such a great blob of fudge? When he was grilled before Congress on the matter, Taylor casually mentioned that in other experiments these “adjustments” ranged from 20 percent to 225 percent. He defended these unsightly “wags” (wild-ass guesses, in M.B.A.-speak) as the product of his “judgment” and “experience”—but, of course, the whole point of scientific management was to eliminate the reliance on such inscrutable variables.
One of the distinguishing features of anything that aspires to the name of science is the reproducibility of experimental results. Yet Taylor never published the data on which his pig iron or other conclusions were based. When Carl Barth, one of his devotees, took over the work at Bethlehem Steel, he found Taylor’s data to be unusable. Another, even more fundamental feature of science—here I invoke the ghost of Karl Popper—is that it must produce falsifiable propositions. Insofar as Taylor limited his concern to prosaic activities such as lifting bars onto rail cars, he did produce propositions that were falsifiable—and, indeed, were often falsified. But whenever he raised his sights to management in general, he seemed capable only of soaring platitudes. At the end of the day his “method” amounted to a set of exhortations: Think harder! Work smarter! Buy a stopwatch!
The trouble with such claims isn’t that they are all wrong. It’s that they are too true. When a congressman asked him if his methods were open to misuse, Taylor replied, No. If management has the right state of mind, his methods will always lead to the correct result. Unfortunately, Taylor was right about that. Taylorism, like much of management theory to come, is at its core a collection of quasi-religious dicta on the virtue of being good at what you do, ensconced in a protective bubble of parables (otherwise known as case studies).
Curiously, Taylor and his college men often appeared to float free from the kind of accountability that they demanded from everybody else. Others might have been asked, for example: Did Bethlehem’s profits increase as a result of their work? Taylor, however, rarely addressed the question head-on. With good reason. Bethlehem fired him in 1901 and threw out his various systems. Yet this evident vacuum of concrete results did not stop Taylor from repeating his parables as he preached the doctrine of efficiency to countless audiences across the country.
The Management Myth
http://www.theatlantic.com/magazine/archive/2006/06/the-management-myth/4883/5/?single_page=true
2. Antarctic Mission to Look for Life in Sub-Glacial Lake
Researchers from the British Antarctic Survey will drill 3km through ice sheet to take samples from Lake Ellsworth
It is a mission into the uncharted and unknown in search of the hardiest life forms on Earth. British engineers set off last week to explore a lake that has been isolated from the rest of the planet for hundreds of thousands of years, three kilometres beneath the Antarctic ice.
Researchers from the British Antarctic Survey (BAS) will use a hot-water "drill" to cut through the ice cap to Lake Ellsworth, on the western Antarctic ice sheet. By sampling the contents of the lake, which is liquid because of the extreme pressure of the ice on top of it, they hope to find clues about the evolution of life.
The predominant mood among the scientists is one of intense curiosity. "We really don't know what to expect," said Martin Siegert of the University of Edinburgh, one of the principal investigators on the expedition. "Whether we will find lots of life, whether we'll find low levels of life on the edge of existence, or whether we'll find nothing."
In recent decades, scientists have found bacteria and other single-celled organisms that have evolved to live in conditions in which other life forms would struggle to survive, such as darkness or extreme temperatures or salinity. The scientists believe that Lake Ellsworth might be a haven for these so-called "extremophiles".
"There is [also] a chance that viruses might well be there, bacteria might well be there and other more complex forms, but we don't believe other macro-organisms are down there," said Siegert.
David Pearce, science coordinator at the BAS and part of the team that will make the measurements next year once the equipment is in place, said finding life in a lake that had been isolated from the rest of the biosphere for so long would reveal much about life on Earth, but "if we find nothing, this will be even more significant, because it will define limits at which life can no longer exist on the planet".
Whatever is found, it will shed light on the potential of life existing elsewhere in our solar system. Europa, one of the moons of Jupiter, has an icy crust with a liquid ocean underneath, and some astrobiologists think that life might be able to survive there. "If life is teeming in Lake Ellsworth, then we know it's a very good habitat and it might change our appreciation of other places, Europa included," said Siegert.
Lake Ellsworth will be the first of Antarctica's 387 known sub-glacial lakes to be sampled directly. "We don't know whether the lake is 100,000 years old, 400,000 years old or a million years old or older," said Siegert. "These are questions we need answers to."
All the equipment that will be sent into the lake will be pre-sterilised and bagged in clean rooms and laboratories in the UK. The instruments will be unsealed for use only when they are in the borehole – the samples of lake water, for example, will be brought up to the surface in pressurised titanium cylinders to preserve their contents.
The team will have 24 hours to take all they need before the bitter cold causes the water in the borehole to freeze solid and seal the lake once more.

The engineers who will install the equipment left for the Antarctic on Friday, taking with them around 70 tonnes of equipment. In a year, the science team will follow and spend four days drilling and taking their samples.
"The detailed analyses will take place in the UK in the following months and it'll probably be, at the earliest, around Easter time [in 2013] before we would be prepared to tell everybody what's in there," said Siegert.
Antarctic Mission to Look for Life in Sub-Glacial Lake
http://www.guardian.co.uk/world/2011/oct/15/antarctic-mission-sub-glacial-lake#
3. The History of the Laser
It is one of the best examples of how technology can go from the science of the future to everyday use in a short period of time. Laser is short for Light Amplification by Stimulated Emission of Radiation. The idea behind lasers is complex. Just how complex? Consider that it took the mind of Albert Einstein to discover the physics behind the laser. Theodore Maiman succeeded in building the first working laser in nineteen sixty. Mr. Maiman worked at Hughes Research Laboratories in Malibu, California.
A laser fires a light beam. Before the laser, scientists developed a similar device: a maser which stands for Microwave Amplification by Stimulated Emission of Radiation. A maser is basically a microwave version of the laser. Microwaves are a form of electromagnetic radiation similar to, but shorter than, radio waves. The best-known use of masers is in highly accurate clocks. In the nineteen fifties, researchers in the United States and Russia independently developed the technology that made both masers and lasers possible. Charles Townes was a professor at the Massachusetts Institute of Technology in Cambridge, Massachusetts. He and his students developed the first maser. Russians Nicolay Basov and Aleksandr Prokhorov did their research in Moscow. Their work led to technology important to lasers and masers. The three men received the Nobel Prize in Physics in nineteen sixty-four.
The idea of a thin beam of light with deadly power came much earlier. By the end of the eighteen hundreds, the industrial revolution had shown that science could invent machines with almost magical powers. And some writers of the time were the first to imagine something like a laser. In eighteen ninety-eight, H.G. Wells published a science fiction novel called “The War of the Worlds.” In it, he described creatures from the planet Mars that had technology far beyond anything on Earth. Among their weapons was what Wells called a “heat ray.”

Laser light is different from daylight or electric light. It has a single wavelength, or color. Laser light is also highly organized. Light behaves like a wave, and laser light leaves its source in one orderly wave at a time.
The physics of the laser may be complex. Still, it is just a story of how electrons interact with light. When a light particle, or photon, hits an electron, the electron jumps to a higher energy state. If another photon strikes one of these high-energy electrons, the electron drops back down and releases a photon that travels in step with the striking photon, so two identical photons leave where one arrived. When this process is repeated enough times, large numbers of organized, or coherent, photons are produced.

Industry put lasers to work almost immediately after they were invented in nineteen sixty. But weapons were not first on the list. The first medical operation using a laser took place the year following its invention. Doctors Charles Campbell and Charles Koester used a laser to remove a tumor from a patient’s eye at Columbia-Presbyterian Hospital in New York City. Since then, doctors have used lasers to cut and remove tissue safely with little risk of infection. Other health uses include medical imaging and vision correction surgery. Eye surgeons use lasers in LASIK operations to reshape the cornea, the clear front surface of the eye. The reshaped cornea corrects the patient’s bad eyesight so he or she does not have to wear glasses or other corrective lenses.
Lasers have made measurement an exact science. Astronomers have used lasers to measure the moon’s distance from Earth to within a few centimeters. Mappers and builders use laser technology every day. For example, drawing a perfectly level straight line on a construction site is easy using a laser. Energy researchers are using lasers in an attempt to develop fusion, the same energy process that powers the sun. Scientists hope fusion can supply almost limitless amounts of clean energy in the future. Manufacturers have used lasers for years to cut and join metal parts. And the jewelry industry uses lasers to write on the surface of the world’s hardest substance, diamonds. Laser barcode scanners have changed how stores record almost everything. They help businesses keep track of products. They help in storage and every detail of the supply process.
Lasers are found in many products used almost everywhere. Laser printers can print out forms and documents quickly and are relatively low in cost. They are required equipment for offices around the world. If you have a CD or DVD player, you own a laser. Laser disc players use lasers to accurately read or write marks on a reflective, coated plastic disc. A device turns these optical signals into digital information that becomes music, computer software or a full-length movie.
Over one hundred years ago, writers imagined that beams of light could be powerful weapons. Today, lasers guide missiles and bombs. For example, pilots can mark a target invisibly with a laser. Bombs or missiles then track the target with deadly results. And, yes, American defense companies are working on giant laser guns recognizable to science fiction fans everywhere. But there are technological difficulties. Scientific American magazine says huge lasers turn only about twenty to thirty percent of the energy they use into a laser beam. The rest is lost as heat.
That has not stopped scientists from working to perfect powerful lasers that, one day, may be able to shoot missiles out of the sky.
http://learningenglish.voanews.com/content/the-history-of-the-laser/1597831.html
4. Hadron Collider
The Large Hadron Collider (LHC) is the world's largest and highest-energy particle accelerator. It was built by the European Organization for Nuclear Research (CERN) from 1998 to 2008, with the aim of allowing physicists to test the predictions of different theories of particle physics and high-energy physics, and in particular to prove or disprove the existence of the hypothesized Higgs boson and of the large family of new particles predicted by supersymmetric theories. The LHC is expected to address some of the still unsolved questions of physics, advancing human understanding of physical laws. It contains six detectors, each designed for specific kinds of exploration.
The LHC was built in collaboration with over 10,000 scientists and engineers from over 100 countries, as well as hundreds of universities and laboratories. It lies in a tunnel 27 kilometres (17 mi) in circumference, as deep as 175 metres (574 ft) beneath the Franco-Swiss border near Geneva, Switzerland.
The term hadron refers to composite particles composed of quarks held together by the strong force (as atoms and molecules are held together by the electromagnetic force). The best-known hadrons are protons and neutrons; hadrons also include mesons such as the pion and kaon, which were discovered during cosmic ray experiments in the late 1940s and early 1950s.
A collider is a type of particle accelerator with directed beams of elementary particles. In particle physics, colliders are used as a research tool: they accelerate particles to very high kinetic energies and let them impact other particles. Analysis of the byproducts of these collisions gives scientists good evidence of the structure of the subatomic world and the laws of nature governing it. Many of these byproducts are produced only by high-energy collisions, and they decay after very short periods of time. Thus many of them are hard or impossible to study in other ways.