MATERIALS FOR CLASSROOM WORK

(LESSON ONE)

Check whether you know the following words.

1) collect v, scene n, extensive a, intelligence n, extraordinary a, linear a, instruction n, matrices n, command n, accuracy n, total a, detect v, correct a, programming n, variety n

2) mean v, means n, design n, tool n, solve v, purpose n, applicable a, task n, consider v, state v, tremendously adv, simulate v, perform v, significant a, attempt n, brain n, surely adv, similarity n, exist v, velocity n, manufacture v, expensive a, influence n, include v, compare v, virtual a, accept v, demand v, require v, level n, merely adv, enable v, improve v, recently adv, lead v, fill v, capability n, previously adv, point v, enhance v, availability n, expect v, particular a, need v

Familiarize yourselves with the terms used in the Main Text.

1. arithmetic problems — арифметические задачи

2. to digest and analyse measurements — обобщать и анализировать измерения

3. memory-size — объем памяти

4. secondary storage — вспомогательное ЗУ

5. central processing unit — ЦПУ

6. to feed — вводить

7. to take out — выводить

8. to issue commands — задавать команды

9. assembler language — ассемблер

10. running time — время работы

11. bit-map — схема распределения

12. a pointing device — указатель

13. communication modalities — способы предъявления

14. database processing tools — средства обработки базы данных

15. distributed-processing system — система обработки с распределением

MAIN TEXT

1. Translate the first part (I) of the Main Text orally in class under the teacher's guidance.

2. Look through the second part (II) of the Main Text and summarize its content briefly in Russian.

COMPUTER AS IT IS

I. The word "computer" comes from a Latin word which means to count. A computer is really a very special kind of counting machine.

Initially, the computer was designed as a tool to manipulate numbers and thus solve arithmetic problems. Although designed originally for arithmetic purposes, at present it is applicable to a great variety of tasks.

Nowadays computers are considered to be complicated machines for doing arithmetic and logic. The computer may be stated to have become an important and powerful tool for collecting, recording, analysing, and distributing tremendous masses of information.

Viewed in the contemporary scene and historical perspective, the computer simulates man. Indeed, two important and highly visible characteristics of man are his intelligence and his ability to perform in and control his environment.

Significantly, man's attempts to understand the phenomena of intelligence, control and power have led to simulations of his brain, of himself and of the organizational and group structures in which he most often finds himself. In the last 30 years man has made extensive use of the computer for these simulations.

Surely, there are similarities with the human brain, but there exists one very important difference. Despite all its accomplishments, the so-called electronic brain must be programmed by a human brain.

As already stated, originally computers were used only for doing calculations.

Today it would be difficult to find any task that calls for the processing of large amounts of information that is not performed by a computer. In science computers digest and analyse masses of measurements, such as the sequential positions and velocities of a spacecraft, and solve extraordinarily long and complex mathematical problems, such as the trajectory of the spacecraft. In commerce they record and process inventories, purchases (покупка), bills, payrolls (платежная ведомость), bank deposits and the like, and keep track of ongoing business transactions. In industry they monitor and control manufacturing processes. In government they keep statistics and analyse economic information.
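
As a small illustration of the kind of measurement processing mentioned above, the following C sketch estimates a spacecraft's velocity from sequential position readings; the position values and the sampling interval are invented for the example.

```c
#include <stdio.h>

/* Toy illustration of "digesting" measurements: estimate a
   spacecraft's velocity from sequential position readings.
   The positions and the sampling interval are invented numbers. */
int main(void)
{
    const double dt = 1.0;                         /* seconds between readings */
    const double pos[] = {0.0, 7.8, 15.7, 23.5};   /* km, hypothetical */
    const int n = sizeof pos / sizeof pos[0];

    for (int i = 1; i < n; i++)
        printf("t=%2d s  velocity = %.2f km/s\n",
               i, (pos[i] - pos[i - 1]) / dt);
    return 0;
}
```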

A computer system can perform millions of operations a second. In the mid-1950's the average speed of main memory was about 10 ms, in the mid-1960's 1 ms, in the mid-1970's a tenth to a hundredth of a microsecond, and by the mid-1980's it had increased greatly again.

The computer's role is influenced not only by its speed but also by its memory size. A large memory makes it easier to work with large programs and their data (compare linear programming or regression analysis, which require large matrices).
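
To see why large matrices call for a large memory, here is a rough C sketch that computes the storage a dense n-by-n matrix of double-precision numbers would occupy; the matrix sizes are hypothetical.

```c
#include <stdio.h>

/* Rough illustration: storage needed for a dense n-by-n matrix of
   8-byte (double-precision) numbers, as used in linear programming
   or regression analysis.  The matrix sizes are hypothetical. */
int main(void)
{
    const long sizes[] = {100, 1000, 5000};

    for (int i = 0; i < 3; i++) {
        long n = sizes[i];
        long bytes = n * n * (long)sizeof(double);
        printf("%4ld x %-4ld matrix: %10ld bytes (about %.1f MB)\n",
               n, n, bytes, bytes / (1024.0 * 1024.0));
    }
    return 0;
}
```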

The increase in main memory capacity has been spectacular too: mid-1950's 100 thousand bits, mid-1960's 1 to 10 million, mid-1970's nearly 1 billion bits. Secondary storage has been greatly expanded by the use of discs. Primary and secondary storage have been integrated by the virtual memory technique.

Although intended for different purposes, computers virtually do not differ in structure.

Any computer is, architecturally, like any other computer. Regardless of their size or purpose, most computer systems consist of three basic elements: the input-output ports, the memory hierarchy and the central processing unit. The input-output ports are paths whereby information (instructions and data) is fed into the computer or taken out of it by such means as punch cards, magnetic tapes and terminals. The memory hierarchy stores the instructions (the program) and the data in the system so that they can be retrieved quickly on demand by the central processing unit. The central processing unit controls the operation of the entire system by issuing commands to other parts of the system and by acting on the responses. When required, it reads information from memory, interprets instructions, performs operations on the data according to the instructions, writes the results back into the memory and moves information between memory levels or through the input-output ports. The operations it performs on the data can be either arithmetic or logical.
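
The cycle just described (read an instruction from memory, interpret it, operate on the data, write the result back) can be shown as a toy fetch-and-execute loop in C. The three-instruction machine below is invented purely for illustration and does not correspond to any real processor.

```c
#include <stdio.h>

/* A toy central processing unit: it repeatedly reads an instruction
   from the stored program, interprets it, performs an arithmetic or
   logical operation on the data, and writes the result back to
   memory.  The three-opcode instruction set is invented. */

enum { OP_HALT = 0, OP_ADD = 1, OP_AND = 2 };

struct instr { int op, dst, src; };

int main(void)
{
    int memory[8] = {5, 3, 0x0F, 0x09, 0, 0, 0, 0};   /* data memory    */
    struct instr program[] = {                        /* stored program */
        { OP_ADD, 4, 0 },   /* memory[4] += memory[0] */
        { OP_ADD, 4, 1 },   /* memory[4] += memory[1] */
        { OP_AND, 2, 3 },   /* memory[2] &= memory[3] */
        { OP_HALT, 0, 0 }
    };

    for (int pc = 0; ; pc++) {                /* fetch the next instruction */
        struct instr in = program[pc];
        if (in.op == OP_HALT) break;          /* interpret it...            */
        if (in.op == OP_ADD)                  /* ...arithmetic operation    */
            memory[in.dst] += memory[in.src];
        else if (in.op == OP_AND)             /* ...or logical operation    */
            memory[in.dst] &= memory[in.src];
    }                                         /* results are written back   */

    printf("memory[4] = %d, memory[2] = %d\n", memory[4], memory[2]);
    return 0;
}
```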

As stated above, any computer is, architecturally, like any other computer, including those of the early days. However, there are differences, and they are the following. An early processor used to be made of thousands of vacuum tubes. Reliability was measured in mere hours between failures, and the cooling plant was often larger than the computer itself. Then the transistor was invented, and the number of transistors in each mainframe was enormous. Besides, in computers of the 1950's the transistors, diodes, resistors, capacitors and other components were mounted on printed-circuit (PC) cards. A typical 5-in. card contained a dozen transistors and a hundred other parts. A card might have contained a single flip-flop, and a thousand cards were required to build each computer.

In the early 1960's semiconductor makers created a wholly new technology: a whole flip-flop could be integrated. Several integrated circuits (ICs) could be mounted on a single printed card. Soon, improved fabrication processes enabled even more complex circuits to be created in a single IC. The new technology was called medium-scale integration (MSI), and the older technology was labelled small-scale integration (SSI). The progress towards smaller computers continued.

When used in computers, discrete transistors were too costly and unreliable, too large and too slow.

In the 1960's advances in microelectronic components led to the development of the minicomputer, followed more recently by an even smaller microcomputer. Both have filled a need for small but relatively flexible processing systems able to execute comparatively simple computing functions at lower cost.

In 1971, Intel Corp. delivered the first microprocessor, the 4004. All the logic to implement the CPU, the central processing unit, of a tiny computer was put onto a single silicon chip less than 1/4-in. square. That design was soon followed by many others. The progress toward smaller computers is likely to continue: there is already talk of nano-computers and pico-computers.

When the central processing unit (CPU) of a computer is implemented in a single or very small number of integrated circuits, we call it a microprocessor. When a computer incorporates a microprocessor as its major component, the resulting configuration is called a microcomputer. When the entire computer, including CPU, memory and input-output capability, is incorporated into a single IC, we also call that configuration a microcomputer. To distinguish between the two microcomputer types, we call the latter a one-chip microcomputer.

Modern computers and microelectronic devices have interacted so closely in their evolution that they can be regarded as virtually symbiotic. Microelectronics and data processing are linked.

Today the hardware in data-processing machines is built out of microelectronic devices. Advances in microelectronic devices give rise to advances in data-processing machinery.

As previously pointed out, computers today are providing an expanding range of services to a rapidly growing pool (количество) of users. Such facilities could make our lives easier, and indeed they already enhance productivity. Yet a bottleneck (трудность) remains which hinders the wider availability of such systems; this bottleneck is the man-machine communication barrier.

Simply put, today's systems are not very good at communicating with their users. They often fail to understand what their users want them to do and are then unable to explain the nature of the misunderstanding to the user. Communication with the machines is sometimes time-consuming. What are the causes of this communication barrier?

One of the most important causes of the man-machine communication barrier is that an interactive computer system typically responds only to commands phrased with total accuracy in a highly restricted artificial language designed specifically for that system. If a user fails to use this language or makes a mistake, however small, an error message is the response he can expect.
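
A minimal C sketch of this rigidity is given below. The two-command artificial language (LIST and QUIT) is invented for the example; any input that is not typed with total accuracy gets nothing but an error message.

```c
#include <stdio.h>
#include <string.h>

/* Sketch of a rigid command interpreter: the artificial language has
   only two commands, LIST and QUIT (both invented for illustration),
   and any input that does not match them exactly produces nothing
   but an error message. */
int main(void)
{
    char line[64];

    while (fgets(line, sizeof line, stdin) != NULL) {
        line[strcspn(line, "\n")] = '\0';       /* strip the newline */
        if (strcmp(line, "QUIT") == 0)
            break;
        else if (strcmp(line, "LIST") == 0)
            printf("(listing would appear here)\n");
        else                                    /* however small the mistake... */
            printf("?SYNTAX ERROR\n");          /* ...only an error message     */
    }
    return 0;
}
```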

II. Several developments have helped to reduce programming effort. High-level languages like FORTRAN, ALGOL, PL-1, and COBOL have replaced assembler languages to a great extent. There is a trend towards languages with a free format and more error checking. Thus programming itself takes less time, since fewer errors are made and residual errors are detected and corrected more rapidly. ADA seems destined to become the dominant programming language of the 1980's. The term "ADA" comes from the name of Byron's daughter Ada, Lady Lovelace. She was the first programmer in the world.

These high-level languages, however, require more compilation and running time, and more memory space.

Currently, almost all man-machine interaction takes place through typed input and output. Superficially, at least, this mode seems to be what human communication needs.

However, this type of man-machine communication is rapidly becoming outmoded by a generation of powerful personal computers. These machines are intended for dedicated use by a single individual and feature an integral high-resolution bit-map graphics display with a pointing device, as well as a conventional keyboard. This allows the computers to provide multiple independent output channels. Besides extra communication channels, such machines provide for different communication modalities: a graphics screen can display line drawings or images and produce attention-commanding effects such as highlighting (высвечивать) or flashing the background of certain areas of the screen.
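
Highlighting or flashing an area of a bit-map display amounts to changing the bits behind that area. The C sketch below models this with a small character array standing in for the bit-map; the screen size and the one-byte-per-pixel representation are simplifying assumptions.

```c
#include <stdio.h>

#define WIDTH  32          /* toy screen size, chosen for illustration */
#define HEIGHT 8

static char screen[HEIGHT][WIDTH];   /* one byte per pixel: a crude bit-map */

/* Invert the background of a rectangular area, which is essentially
   what highlighting or flashing a region of a bit-map display does. */
static void highlight(int x, int y, int w, int h)
{
    for (int row = y; row < y + h && row < HEIGHT; row++)
        for (int col = x; col < x + w && col < WIDTH; col++)
            screen[row][col] = !screen[row][col];
}

int main(void)
{
    highlight(4, 2, 10, 3);          /* flash a 10-by-3 area at (4, 2) */

    for (int row = 0; row < HEIGHT; row++) {
        for (int col = 0; col < WIDTH; col++)
            putchar(screen[row][col] ? '#' : '.');
        putchar('\n');
    }
    return 0;
}
```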

The multiple communication channels and modalities allow for more effective interaction.

Recent computer technology advances are the following: voice annotations, facsimile images, hand-drawn sketches, and animated sequences. The potential advantages of multimedia communications technology are too great to ignore.

Many scientists are conducting research on man-machine communication. The work is ongoing. Of particular interest are information systems that model complex real-world events.

Active information systems are database processing tools intended to represent and manipulate data descriptions of large real-world systems that have a complex dynamic behaviour.

It is apparent that if the languages of the recipient and the sender differ, the data of the message cannot be used. Problems in understanding the content must be resolved by cooperation between the sender and the recipient.

In automated information systems the computers must receive and at the same time interpret and act on the data. In information systems, to be more explicit, the fields of computers and communications are merging.

In this case data reliability is a significant design factor. More and more data are stored in machines without paper or manual backup. That data must be accurate, protected, and available.

Besides, computers and information systems are becoming more distributed. At the same time the integration and coordination of the individual information systems and computers in an organization are becoming more of a necessity. This introduces new requirements, design parameters, and tradeoffs.

These considerations affect system issues ranging from the architecture of specific computers to the architecture of overall information systems.

To sum up, computers have certain disadvantages. We have not given them those common-sense skills of interaction and communication that people find so natural and effortless. Nevertheless, computers are fast enough to permit man to control mechanisms having rates of response exceeding his own reaction time.

The computer has made it possible to mechanize much of the information interchange and processing that constitute the nervous system of our society.

The versatility and convenience of the microprocessor have altered the entire architecture of modern computer systems. No longer is the processing of information carried out only in the computer's central processing unit. Today there is a trend toward distributing more processing capability throughout a computer system, with various areas having small local processors for handling operations in those areas.

There are a number of advantages to distributed processing. First, since many elements of the computer can be working on different portions of the same task, the work may be done faster. Second, if one element in the network malfunctions, its workload can be shifted to another element or shared among several elements, so that the work as a whole is relatively immune to failure. Third, the network can be small enough to be contained within a single laboratory or building, or it can be spread out over a wide area.
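
The second advantage, shifting the workload of a failed element to the remaining ones, can be modelled with a toy C sketch; the number of elements, the failure and the task portions are all simulated for illustration.

```c
#include <stdio.h>

#define NODES 4
#define TASKS 12

/* Toy model of a distributed-processing network: TASKS portions of a
   job are dealt out to NODES processing elements; if one element
   malfunctions, its portions are simply shifted to the elements that
   remain.  The failure here is simulated, not detected. */
int main(void)
{
    int alive[NODES] = {1, 1, 0, 1};   /* element 2 has malfunctioned */
    int assigned[NODES] = {0};

    int next = 0;
    for (int task = 0; task < TASKS; task++) {
        while (!alive[next % NODES])   /* skip the failed element */
            next++;
        assigned[next % NODES]++;
        next++;
    }

    for (int node = 0; node < NODES; node++)
        printf("element %d: %s, %d portions of the task\n",
               node, alive[node] ? "up  " : "down", assigned[node]);
    return 0;
}
```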

A major obstacle to designing an effective distributed-processing system is the difficulty involved in writing the system's software, which must enable the various elements of the network to operate and interact efficiently.

The method of processing data as well as available peripheral devices define computer generations. We are now operating third and fourth generation computers and looking ahead to the fifth. An advantage of the fifth generation will be the ability of people without knowledge of programming to use computer terminals. Remote processing will be common too.
