UNIT 1 COMPUTER HISTORY AND APPLICATION
1 In groups or with a partner discuss the following questions.
· Are you an advanced computer user?
· How much time a day do you spend in front of the screen?
· When did you learn how to use a computer?
· How important is the computer in your studies and in your daily life?
2 Say what devices can be used to do the following actions. Is there anything in the list that can be done only with the help of a computer?
To type and print documents, to play music, to send messages, to surf the net, to process data, to download and store information, to make graphs.
3 Reconstruct the jumbled sentences. Do you think these statements are true? Add your own definition of a computer.
4 You will read an article about the personal computer from an online encyclopedia. Before you read, suggest your answers to these questions. Then look quickly through the text and check your answers.
Read the article and check your answers to the previous activity. Fill in the gaps (1-6) with phrases (A-G). One phrase need not be used.
A competing operating systems
B capable of running the same software
C to put an entire CPU on one chip
D to form networks
E to design electrical installations and lighting systems
F single-user systems
G computing power and graphics capability
PERSONAL COMPUTER
A personal computer is a small, relatively inexpensive computer designed for an individual user. In price, personal computers range anywhere from a few hundred dollars to thousands of dollars. All are based on the microprocessor technology that enables manufacturers 1________________________________. CPU stands for Central Processing Unit. The CPU is called a processor because it can work with data: it can do calculations and it can move data. Businesses use personal computers for word processing, accounting, desktop publishing, and for running spreadsheet and database management applications. At home, the most popular use for personal computers is playing games.
Personal computers first appeared in the late 1970s. One of the first and most popular personal computers was the Apple II, introduced in 1977 by Apple Computer. During the late 1970s and early 1980s, new models and 2__________________________________________________ appeared daily. Then, in 1981, IBM entered the market with its first personal computer, known as the IBM PC. The IBM PC quickly became the personal computer of choice, and most other personal computer manufacturers fell by the wayside. One of the few companies to survive IBM's entry was Apple Computer, which remains a major player in the personal computer marketplace. Other companies adjusted to IBM's dominance by building IBM clones, computers that were internally almost the same as the IBM PC but cost less. Because IBM clones used the same microprocessors as IBM PCs, they were 3__________________________________________. Over the years, IBM has lost much of its influence in directing the evolution of PCs. Today, the world of personal computers is basically divided between Apple Macintoshes and PCs. The principal characteristics of personal computers are that they are 4_________________________________________ and are based on microprocessors.
However, although personal computers are designed as single-user systems, it is common to link them together 5___________________. In terms of power, there is great variety. At the high end, the distinction between personal computers and workstations has faded. High-end models of the Macintosh and PC offer the same 6___________________________as low-end workstations by Sun Microsystems, Hewlett-Packard, and DEC.
BABY GROWS UP
By Richard Turner
In 1965, the University of Manchester was the first in the country to open a Computer Science degree course, and Linda Brackenbury was one of the first 28 students to enroll. Linda, now a Senior Lecturer in the department, was taught by Tom Kilburn and Sir Freddie Williams, who invented the world's first stored-program computer, 'The Baby', in 1948. 'Baby' had the equivalent processing power of a mobile phone but filled an entire room with technical apparatus and cables. A replica is now based within Manchester's Museum of Science and Industry. Linda looks back at those pioneering days:
Q: What do you remember about your days as a student in Manchester?
A: "It was an extremely exciting time to be an undergraduate in Computer Science (CS). It was really the first course in the country and we were the first lot to be admitted to that course. There were only 28 of us in the first year, and just four girls, and we had a very close relationship with the staff because we were pioneering our way through this course and there was this buzz and excitement about what we were doing."
"Nobody had a crystal ball as to where it was actually going to go, but to me it looked very interesting because I had studied Maths and Physics at advanced level and I wanted something that was going to continue both streams of knowledge. And CS, where we were going to learn something about the hardware and use our maths to do the programming, seemed to be the ideal combination."
"Turing is considered by most people to be one of the original thinkers of the last century and I think he's had a tremendous impact. You can't pick up any textbook which refers to early computing without the name of Turing being mentioned, and he really was a one-off sort of guy. We're very proud, and that's why we're hoping to celebrate in great style on the 40th anniversary."
"Well, the first technology of 1948 was valves. And it was steadily moving into the transistor era, and the first integrated circuits were just coming out around then. It was a time of real technological change, going from valves into much smaller units and circuits, so that machines could be built in a much smaller space."
"That was the Atlas machine, and again that was a transistor machine which was fairly advanced at the time, and it had a lot of very sophisticated facilities compared with the 1948 machine, which was really only the power of your average small handheld computer."
"If you imagine a very large lounge-cum-dining room, and you imagine cabinets down both sides, along the long walls, and looms of cables strung between the two rows of cabinets, you'll get the sort of feeling of how big the whole thing was."
"It's a replica of the world's first computer. It's a rebuild, a very good rebuild, and the person who's done the rebuilding, Chris Burton, has had to go right round the country to find all the bits and pieces."
"Although the technology has changed a great deal, the underlying principles are very much as they always were. Of course, things are a lot more sophisticated now and we're able to do more things, but when you look at it, it all stems from the 1948 machine. I hope that the message they take away is that we've come a long way and there's still a long way to go. And that it's an exciting topic to be in."
UNIT 2 COMPUTER STRUCTURE
HOW TO READ THE COMPUTER AD
1 The main processing chip, which operates at a clock speed of 1.7 thousand million cycles per second.
2 A small, tall and narrow style of case containing the computer system.
3 256 megabytes of Rambus dynamic type main memory chips that constitute the computer's RAM.
4 A hard drive internal storage device with a capacity of approx. 60 thousand million bytes.
5 A video controller for controlling the monitor screen that is built onto the computer motherboard. It can process 3D images using the AGP type of video bus interface. It also contains approx. 64 million bytes of synchronous dynamic random access memory that is used as video memory.
6 A soundcard that has 64 voices and generates sounds using the wavetable system.
7 A CD-ROM storage device that operates at 48 times the speed of the original CD-ROM devices.
8 A colour monitor for displaying output on a screen at resolutions determined by the SVGA standard. The diagonal measurement of the whole screen is 19 inches, but the diagonal measurement of the actual viewable area is only 17.9 inches.
9 The operating system that is used to control the system.
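The capacity and speed figures in the ad can be checked with a little arithmetic. The sketch below is illustrative only; note the convention that drive makers quote decimal units while memory is sized in binary units.

```python
# Decode the storage and speed figures from the ad (illustrative arithmetic).

# Item 4: a "60 GB" hard drive. Drive makers use decimal units,
# so 60 GB = 60 * 10**9 bytes ("60 thousand million bytes").
hard_drive_bytes = 60 * 10**9

# Item 3: "256 MB" of RAM. Memory is sized in binary units,
# so 256 MB = 256 * 2**20 bytes.
ram_bytes = 256 * 2**20

# Item 5: "64 MB" of video SDRAM, also a binary unit.
video_ram_bytes = 64 * 2**20

# Item 1: a 1.7 GHz CPU completes 1.7 thousand million clock cycles
# every second.
cycles_per_second = int(1.7e9)

print(hard_drive_bytes)   # 60000000000
print(ram_bytes)          # 268435456
print(video_ram_bytes)    # 67108864
print(cycles_per_second)  # 1700000000
```

The gap between the two conventions is why a "60 GB" drive reports only about 55.9 binary gigabytes to the operating system.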
CHOOSING A CPU AND RAM
CPU stands for Central Processing Unit. There can be several processors in a computer, but one of them is the central one: the CPU. The CPU is called a processor because it can work with data, and it has two important jobs: it can do calculations and it can move data. The CPU is very fast at doing both jobs. The faster the CPU can do calculations and move data, the faster we say the PC is. The CPU is physically quite small. At its core is an electronic circuit (called a die), which is no bigger than your little fingernail. If you happen to need to choose a CPU for your new PC, what should you choose? Let me give you a bit of food for thought.
The individual components have different lifetimes. The way development has gone up until the present, CPUs and motherboards have been the components that have become obsolete the most quickly. CPUs and motherboards also go together: you normally change them both at the same time. The question is, then, how important is it to have the latest technology? You have to decide that for yourself. But if you know that your PC has to last for many years, you probably should go for the fastest CPU on the market. For the rest of us, who regularly upgrade and replace our PC's insides, it is important to find the most economical processor. There is usually a price jump in a processor series, such that the very latest models are disproportionately expensive. You have to find the model that gives the most power in proportion to the price.
RAM stands for Random Access Memory. It is a very central component in a PC, for without RAM there can be no data processing. RAM is simply the storage area where all software is loaded and works from. Physically, RAM consists of small electronic chips which are mounted in modules (small printed circuit boards). The modules are installed in the PC's motherboard using sockets; there are typically 2, 3 or 4 of these. However, RAM also has to match the motherboard, chipset and the CPU system bus. You can try experimenting with overclocking, where you intentionally increase the system bus clock frequency. That will mean you need faster RAM than what is normally used in a given motherboard. Normally, however, we simply have to stick to the type of RAM currently recommended for the chosen motherboard and CPU. RAM has a very big impact on a PC's capacity. So if you have to choose between the fastest CPU or more RAM, I would definitely recommend that you go for the RAM. Some will choose the fastest CPU, with the expectation of buying extra RAM later, 'when the price falls again'.
You can also go that way, but ideally, you should get enough RAM from the beginning. But how much is that? If you still use Windows 98, then 256 MB is enough. The system can’t normally make use of any more, so more would be a waste. For the much better Windows 2000 operating system, you should ideally have at least 512 MB RAM; it runs fine with this, but of course 1024 MB or more is better. The same goes for Windows XP:
Recommended amount of PC RAM, matched to the operating system:
Windows 98 - 256 MB
Windows 2000 - 512 MB minimum; 1024 MB or more is better
Windows XP - 512 MB minimum; 1024 MB or more is better
The advantage of having enough RAM is that you avoid swapping. When Windows doesn't have any more free RAM, it begins to artificially increase the amount of RAM using a swap file. The swap file is stored on the hard disk, and leads to much slower performance than if there were sufficient RAM in the PC.
WHAT AM I?
I call myself _____________________
Sitting for hours, totally engaged
In the world of the plastic box, filled with silicon chips
Separated from reality
By a screen
Sometimes visible, touchable, breakable,
Sometimes fusible, volatile, illusive.
I am ___________________________
Devouring the information juice
Jumping from one page to another
Tracking the latest news, gears and even gossip.
Non-stop
I am safe inside my anonymous coat
Or sometimes seek shelter in nickname loads
My existence, untraceable, undeniable.
I become _____________________________
Convicted of addiction and cyber malevolence
Of health deterioration and privacy violation
A list of concerns, denials, criticism and old folks' grievances.
Never-ending
How malignant a benign cancer!
How threatening a vulnerable creature!
My future, glorious or gloomy?
What am I? You decide!
UNIT 3 COMPUTER SOFTWARE. OPERATING SYSTEMS
HOW COMPUTER VIRUSES WORK
A computer virus, an unwanted program that has entered your system without your knowing about it, has two parts, which I'll call the infector and the detonator. They have two very different jobs. One of the features of a computer virus that separates it from other kinds of computer program is that it replicates itself, so that it can spread (via floppies transported from computer to computer, or via networks) to other computers. After the infector has copied the virus elsewhere, the detonator performs the virus's main work. Generally, that work is either damaging data on your disks, altering what you see on your computer display, or doing something else that interferes with the normal use of your computer.
Here's an example of a simple virus, the Lehigh virus. The infector portion of Lehigh replicates by attaching a copy of itself to COMMAND.COM (an important part of DOS), enlarging it by about 1000 bytes. So let's say you put a floppy containing COMMAND.COM into an infected PC at your office, that is, a PC that is running the Lehigh program. The infector portion of Lehigh looks over DOS's shoulder, monitoring all floppy accesses. The first time you tell the infected PC to access your floppy drive, the Lehigh infector notices the copy of COMMAND.COM on the floppy and adds a copy of itself to that file. Then you take the floppy home to your PC and boot from the floppy. (In this case, you've got to boot from the floppy in order for the virus to take effect, since you may have many copies of COMMAND.COM on your hard and floppy disks, but DOS only uses the COMMAND.COM on the boot drive.) Now the virus has silently and instantly been installed in your PC's memory. Every time you access a hard disk subdirectory or a floppy disk containing COMMAND.COM, the virus sees that file and infects it, in the hope that this particular COMMAND.COM will be used on a boot disk on some computer someday. Meanwhile, Lehigh keeps a count of infections.
Once it has infected four copies of COMMAND.COM, the detonator is triggered. The detonator in Lehigh is a simple one. It erases a vital part of your hard disk, making the files on that part of the disk no longer accessible. You grumble and set about rebuilding your work, unaware that Lehigh is waiting to infect other unsuspecting computers if you boot from one of those four infected floppies.
HOW TO AVOID VIRUSES
Don't worry too much about viruses. You may never see one. There are just a few ways to become infected that you should be aware of. The sources seem to be service people, pirated games, putting floppies in publicly available PCs without write-protect tabs, commercial software (rarely), and software distributed over computer bulletin board systems (also quite rarely, despite media misinformation). Many viruses have spread through pirated (illegally copied or broken) games. This is easy to avoid. Pay for your games, fair and square. If you use a shared PC or a PC that has public access, such as one in a college PC lab or a library, be very careful about putting floppies into that PC's drives without a write-protect tab. Carry a virus-checking program and scan the PC before letting it write data onto floppies. Despite the low incidence of actual viruses, it can't hurt to run a virus-checking program now and then. There are actually two kinds of antivirus programs: virus shields, which detect viruses as they are infecting your PC, and virus scanners, which detect viruses once they've infected you. Viruses are something to worry about, but not a lot. A little common sense and the occasional virus scan will keep you virus-free.
Vocabulary note:
Fair and square - honestly
It can't hurt - it's probably a good idea
PROGRAMMING LANGUAGES
A programming language is an artificial language that can be used to control the behavior of a machine, particularly a computer. A prominent purpose of programming languages is to provide instructions to a computer.
As such, programming languages differ from most other forms of human expression in that they require a greater degree of precision and completeness. When using a natural language to communicate with other people, human authors and speakers can be ambiguous and make small errors, and still expect their intent to be understood. However, computers do exactly what they are told to do, and cannot understand the code the programmer "intended" to write. The combination of the language definition, the program, and the program's inputs must fully specify the external behavior that occurs when the program is executed. Many languages have been designed from scratch, altered to meet new needs, combined with other languages, and eventually fallen into disuse. Although there have been attempts to design one "universal" computer language that serves all purposes, all of them have failed to be accepted in this role. The need for diverse computer languages arises from the diversity of contexts in which languages are used:
Programming languages, like natural languages, are defined by syntactic and semantic rules which describe their structure and meaning respectively. The syntax of a language describes the possible combinations of symbols that form a syntactically correct program. The meaning given to a combination of symbols is handled by semantics. Below is a simple grammar, based on Lisp:
expression ::= atom | list
This grammar specifies the following:
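To make the grammar concrete, the two rules above can be turned into a tiny working reader. The sketch below is illustrative and assumes the usual Lisp conventions (not stated in the text): an atom is any bare token, and a list is a parenthesised sequence of expressions.

```python
# A tiny recursive-descent parser for the Lisp-style grammar
#   expression ::= atom | list
# assuming an atom is a bare token and a list is "( expression* )".

def tokenize(text):
    # Pad parentheses with spaces so they split into separate tokens.
    return text.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    if not tokens:
        raise SyntaxError("unexpected end of input")
    token = tokens.pop(0)
    if token == "(":                      # a list: read items until ")"
        items = []
        while tokens and tokens[0] != ")":
            items.append(parse(tokens))   # each item is itself an expression
        if not tokens:
            raise SyntaxError("missing ')'")
        tokens.pop(0)                     # discard the closing ")"
        return items
    elif token == ")":
        raise SyntaxError("unexpected ')'")
    else:                                 # anything else is an atom
        return token

def read(text):
    return parse(tokenize(text))

print(read("(+ 1 (* 2 3))"))  # ['+', '1', ['*', '2', '3']]
```

Note how the parser mirrors the grammar directly: one branch per alternative in the rule, with the list branch calling back into `parse` recursively.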
It is difficult to determine which programming languages are most used. Some languages are very popular for particular kinds of applications (e.g., COBOL is still strong in the corporate data center, often on large mainframes, FORTRAN in engineering applications, and C in embedded applications and operating systems), while some languages are regularly used to write many different kinds of applications.
Translate into English:
1. The behavior of a machine is controlled by means of an artificial language called a programming language.
2. Programming languages possess a great degree of precision.
3. Some languages serve all purposes; others are used for a particular type of program.
4. Programming languages are constantly changing, and many fall out of use.
5. All attempts to create a universal programming language have failed.
Make a short report on the programming language you most often use. Speak about the following.
UNIT 4 READING ABOUT COMPUTERS
Divide into two groups. Group 1: you are interested in new developments in IT. Group 2: you are interested in the history of IT.
TEXT 1
NOTIONS OF INTELLIGENCE
1 It is quite possible to set out an approximate scale of intelligence: most people are more intelligent than most chimpanzees, a word processor is a more intelligent machine than a typewriter, etc. Nevertheless there is no scientific definition of intelligence. Intelligence is related to the ability to recognise patterns, draw reasoned conclusions, analyse complex systems into simple elements and resolve contradictions, yet it is more than all of these. Intelligence is at a higher level than information and knowledge, but below the level of wisdom. It contains an indefinable 'spark' which enables new insights to be gained, new theories to be formulated and new knowledge to be established. Intelligence can also be examined from the point of view of language. Information can easily be represented as words, numbers or some other symbols. Knowledge is generally expressed in a language or as mathematics. Intelligence is at the upper limit of language: instances of patterns or deductive reasoning can be written down, and certain general principles can be stated. However, the creative 'spark' of intelligence is almost impossible to express in language.
2 The only widely accepted definition of artificial intelligence is based on a test devised by Alan Turing in 1950: Suppose there are two identical terminals in a room, one connected to a computer, and the other operated remotely by a person. If someone using the two terminals is unable to decide which is connected to the computer, and which is operated by the person, then the computer can be credited with intelligence. The definition of artificial intelligence which follows from this test is: Artificial intelligence is the science of making machines do things that would require intelligence if done by people.
3 No computer system has come anywhere near to passing the Turing test in general terms. Nevertheless, progress has been made in a number of specific fields. It would take a very good chess player to be able to tell whether he or she were playing against a computer or a human opponent. Most car drivers are unaware which parts of their cars have been assembled by robots, and which by manual workers.
4 Machines can imitate human performance on many problems, but by using utterly inhuman techniques. Computer chess-players have no concept of strategy; instead, at each turn they scan through several billion different sequences of moves to pick the one that seems the best. Computer logicians make their deductions in ways that no human would, or could. Computer bureaucrats apply the rules more tirelessly and consistently than any of their overworked human brethren. Watching such machines at work, nobody could mistake them for humans, or deny their intelligence.
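The brute-force search the paragraph describes is essentially minimax: score every sequence of moves to a fixed depth and pick the best, with no notion of strategy. The sketch below illustrates the idea on a made-up toy game (each "move" adds to a running total that the machine maximises and the opponent minimises); the game rules are invented purely to keep the example self-contained, not taken from any chess program.

```python
# Minimal minimax search: exhaustively score every move sequence
# to a fixed depth, alternating between a maximising player (the
# machine) and a minimising opponent. Toy game: a move adds 1, 2
# or 3 to a running total; the machine wants it high, the
# opponent subtracts to keep it low.

MOVES = [1, 2, 3]  # the toy move set

def minimax(total, depth, maximizing):
    """Score a position by searching all sequences `depth` plies deep."""
    if depth == 0:
        return total  # leaf evaluation: just the running total
    if maximizing:
        return max(minimax(total + m, depth - 1, False) for m in MOVES)
    return min(minimax(total - m, depth - 1, True) for m in MOVES)

def best_move(total, depth=4):
    """Pick the move whose subtree scores best for the machine."""
    return max(MOVES, key=lambda m: minimax(total + m, depth - 1, False))

print(best_move(0))  # 3
```

Real chess programs add pruning and hand-tuned evaluation functions on top, but the core is the same inhuman technique: no plan, just an exhaustive comparison of outcomes.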
5 The most obvious problem with Turing's challenge is that there is no practical reason to create machine intelligence indistinguishable from human intelligence. People are in plentiful supply. The point of using machines ought to be that they perform differently from people, and preferably better. If that potential is to be exploited, machines will need to be given new forms of intelligence all their own. The real challenge, then, is not to recreate people but to recognise the uniqueness of machine intelligence, and learn to work with it. Working together, man and machine should be able to do things that neither could do separately.
Answer the questions.
1 The author talks about knowledge, intelligence, information and wisdom. Rank these in order from highest to lowest.
2 Whose test led to a definition of artificial intelligence? When was it devised?
3 What results do modern computers show in passing the test?
4 What contradiction can be found in Turing's challenge?
5 What solution to the problem does the author suggest?
TEXT 2
HI-TECH DVD WAR BREAKS OUT
1 High-definition DVD players and discs are the natural complement to the high-definition flat-screen televisions which enabled this summer's World Cup, and David Attenborough's Planet Earth, among other programmes, to be shown in spectacular detail. The new discs look like normal DVDs but can store more than five times more data, allowing higher picture and sound quality and more interactive features.
2 Britain's first high-definition DVD player demonstrated by Samsung promises a revolution in home cinema. But visions of the medium-term future are not so clear. Toshiba is launching a rival high-definition player which uses different technology and is incompatible with Samsung's system. A high-definition disc that plays on one type of machine will not work on the other. The format war has inevitably been billed as the most damaging since JVC's VHS system got the better of Sony's Betamax video cassettes 30 years ago.
3 Samsung's format is called Blu-ray and will also be used by other manufacturers including Sony, Philips, Panasonic, LG and Sharp, with the backing of Apple and Dell computers and Hollywood studios such as Disney and 20th Century Fox. Toshiba's format, known as HD-DVD, has supporters that include NEC, Microsoft and Intel, along with Universal and some European studios.
4 The potential difficulty for DVD collectors is illustrated on the AV Science Forum, an industry website in America. If the research is accurate, only owners of an HD-DVD player will be able to watch Schindler's List, Psycho, ET, To Kill a Mockingbird, Rear Window, Jaws, The Third Man, Vertigo and The Deer Hunter in full high definition. By contrast, only owners of a Blu-ray player will be equipped to enjoy Lawrence of Arabia, On the Waterfront, The Bridge on the River Kwai, Some Like it Hot, Star Wars, Raging Bull, Dr Strangelove and Annie Hall in their best light. Many other titles will be released in both formats.
5 Industry analysts warn, however, that consumers, fearful of ending up with a machine that could be rendered obsolete, might play safe and buy neither. Some fans may be tempted to stick with their existing DVD libraries, which will work on both players and with improved picture quality, although not as well as the high-definition discs.
6 Samsung's BD-P1000 player will go on sale next month at $999, including two high-definition DVDs. Industry watchers have challenged the manufacturers to come up with players capable of showing both Blu-ray and HD-DVD. The Korean company rejected that suggestion: 'It's best for consumers and the industry to have a single format. Technically we could make a player that can handle both types, but we won't for two reasons: first, it drives up costs; second, it concedes that there will be different format types. Making a multiple-format player is not an answer. The best thing is one standard, and Blu-ray is in the strongest position,' said David Steel, the vice-president responsible for digital media.
7 Toshiba has not pulled its punches in an increasingly fraught contest. Olivier Van Wynendaele, its assistant general manager for marketing consumer products, said: 'They argue that Blu-ray has more support from the studios, but that is totally untrue. Europe is different from America: more and more American studios sell the rights to a local distribution company here. For example, Canal+ in France co-produced Terminator 2, so it has the rights in Europe and it will be on HD-DVD in Europe.' Although there is bitter disagreement over which technology offers the best picture quality, Toshiba says its product will be less than half the price of Samsung's: its HD-E1 player will be released in November at $400. In the end it could be price, not technology, that wins the war. (www.guardian.co.uk)
Discuss the advantages and the disadvantages of quick technological changes. What advice would you give to computer developers and producers? What should they take into consideration to make technology more convenient for users?
TEXT 3
X-Y POSITION INDICATOR FOR A DISPLAY SYSTEM
Years before personal computers and desktop information processing became commonplace or even practicable, Douglas Engelbart had invented a number of interactive, user-friendly information access systems that we take for granted today: the computer mouse, windows, shared-screen teleconferencing, hypermedia, groupware, and more. At the Fall Joint Computer Conference in San Francisco in 1968, Engelbart astonished his colleagues by demonstrating the aforementioned systems using an utterly primitive 192-kilobyte mainframe computer located 25 miles away. Engelbart has earned nearly two dozen patents, the most memorable being perhaps for his "X-Y Position Indicator for a Display System": the prototype of the computer "mouse" whose convenience has revolutionized personal computing.
Engelbart's inventions were ahead of their time, but have been integrated into mainstream computing as industry capabilities have increased. It was not until 1984 that the Apple Macintosh popularized the mouse, but today it is difficult to imagine a personal computer without one. And the huge success of Microsoft's Windows 95 proves that Engelbart's original windows concept has also become a virtual necessity. In a recent talk delivered at MIT (June 1996), Bill Gates himself praised Engelbart for his pioneering work. Byte magazine, in an article honoring the 20 persons who have had the greatest impact on personal computing (September 1995), went so far as to say of Engelbart: "Comparisons with Thomas Edison do not seem farfetched." Engelbart now works out of the Bootstrap Institute, which he founded, where he is an inventor and a consultant in multiple-user business computing. His current focus is on a type of groupware called an "open hyperdocument system," which may one day replace paper record keeping entirely.
TEXT 4
A NEW APPLICATION OF LIGHT
The digital compact disc, now commonplace in stereos and computers, was invented in the late 1960s by James T. Russell. Russell was born in Bremerton, Washington in 1931. At age six, he invented a remote-control battleship with a storage chamber for his lunch. Russell went on to earn a BA in Physics from Reed College in Portland in 1953. Afterward, he went to work as a physicist in General Electric's labs in Richland, Washington. At GE, Russell initiated many experimental instrumentation projects. He was among the first to use a color TV screen and keyboard as the sole interface between a computer and an operator, and he designed and built the first electron beam welder.
When, in 1965, Battelle Memorial Institute opened its Pacific Northwest Laboratory in Richland, Washington, Russell joined the effort as senior scientist. He already knew what avenue of research he wanted to pursue. Russell was an avid music listener. Like many audiophiles of the time, he was continually frustrated by the wear and tear suffered by his vinyl phonograph records. He was also unsatisfied with their sound quality: his experimental improvements included using a cactus needle as a stylus. Alone at home on a Saturday afternoon, Russell began to sketch out a better music recording system, and was inspired with a truly revolutionary idea. Russell envisioned a system that would record and replay sounds without physical contact between its parts, and he saw that the best way to achieve such a system was to use light. Russell was familiar with digital data recording, in punch card or magnetic tape form. He saw that if he could represent the binary 0 and 1 with dark and light, a device could read sounds, or indeed any information at all, without ever wearing out. If he could make the binary code compact enough, Russell saw that he could store not only symphonies, but entire encyclopedias on a small piece of film.
After years of work, Russell succeeded in inventing the first digital-to-optical recording and playback system (patented in 1970). He had found a way to record onto a photosensitive platter in tiny "bits" of light and dark, each one micron in diameter; a laser read the binary patterns, and a computer converted the data into an electronic signal, which was then comparatively simple to convert into an audible or visible transmission. This was the first compact disc. Although Russell had once envisioned 3x5-inch stereo records that would fit in a shirt pocket and a video record that would be about the size of a punch card, the final product imitated the phonographic disc which had been its inspiration.
Through the 1970s, Russell continued to refine the CD-ROM, adapting it to any form of data. Like many ideas far ahead of their time, the CD-ROM found few interested investors at first; but eventually, Sony and other audio companies realized the implications and purchased licenses. By 1985, Russell had earned 26 patents for CD-ROM technology. He then founded his own consulting firm, where he has continued to create and patent improvements in optical storage systems, along with bar code scanners, liquid crystal shutters, and other industrial optical instruments. His most revolutionary recent invention is a high-speed optical data recorder/player that has no moving parts. Russell earned another 11 patents for this "Optical Random Access Memory" device, which is currently being refined for the market.
James T. Russell has many interests beyond optical data devices. In fact, he has claimed: "I've got hundreds of ideas stacked up, many of them worth more than the compact disc. But I haven't been able to work on them." Digital engineers and consumers alike will be lucky if he does find the time.
TEXT 5
SOON THE NET COULD BE HEALING ITSELF
IBM has unveiled an ambitious initiative to develop technologies that share the basic biological abilities of living organisms.
Senior researchers at the company said the growing complexity of computers and networks demands that the technology do a better job of maintaining and healing itself. The researchers warn that without these efforts there is a danger that networks will soon become unmanageable.

This week IBM is sending 75,000 copies of a manifesto written by Paul Horn, senior vice president of IBM Research, that details the aims of its Autonomic Computing initiative. Mr Horn warns that humans are losing the battle to manage the increasing complexity of computer systems and networks. This complexity is only going to increase as computer technology shrinks and finds its way into ever more devices. If the current rates of expansion of digital technology are maintained, soon there would not be enough people to keep the world's computer systems running, he said. He called finding ways of handling this complexity the next "grand challenge" facing the technology industry.

Ideally, future networks should resemble the autonomic nervous system, which maintains and monitors many basic bodily functions without conscious help. What is needed, argued Mr Horn, are computer systems that do a much better job of configuring themselves, can work around disruptions, heal any damage they suffer, and fight off potential problems.

IBM is planning its own research programs to create technologies that can turn relatively dumb networks into smarter alternatives. It is also planning to spend millions over the next five years funding 50 research projects at universities to take on the complex challenge. The likely outcome of the project is a series of software standards that define how to build software or hardware that has these more biological properties.

IBM is working closely with the Global Grid Forum. This industry body is driving efforts to turn the disparate computing and research capabilities of the world's science labs into a shared pool of resources that anyone can plug into.
This effort is already driving the creation of software that hides the individual quirks of individual machines and instruments behind common interfaces.
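The self-monitoring, self-healing behaviour the article describes can be hinted at with a toy supervisor loop. This is a minimal sketch only: the `Service` class and its methods are invented for illustration and do not come from IBM's initiative.

```python
# Toy "autonomic" supervisor: it periodically probes a component's health
# and repairs (restarts) it automatically, without human intervention.

class Service:
    """A stand-in for any monitored component (invented for this sketch)."""
    def __init__(self):
        self.healthy = True

    def check(self) -> bool:
        return self.healthy

    def restart(self):
        self.healthy = True

def supervise(service: Service, probes: int) -> int:
    """Run health probes; heal the service whenever a probe fails.

    Returns the number of automatic repairs performed."""
    repairs = 0
    for _ in range(probes):
        if not service.check():
            service.restart()
            repairs += 1
    return repairs

svc = Service()
svc.healthy = False              # simulate a fault appearing in the network
assert supervise(svc, probes=3) == 1   # one automatic repair, then healthy
assert svc.check()
```

Real autonomic systems would of course detect far subtler failures and choose among many recovery actions, but the loop above captures the basic reflex the article compares to the autonomic nervous system: notice damage, heal it, carry on.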
TEXT 6
APPLE MOVES ALL LAPTOPS TO INTEL
Apple is making the MacBook available in black and white.

Apple has moved closer to completing its shift to the Intel chips used by its PC rivals. The company has launched new MacBook laptops to replace its previous consumer model, the iBook. Apple is in the process of moving all its computer products from IBM to Intel chips as part of its efforts to attract more consumers and increase its 5% share of the US market. The remaining Macs still running on IBM chips are the high-end desktop PowerMacs.

"Apple began the transition to Intel Core Duo-based notebooks in February with the 15-inch MacBook Pro, and now just 90 days later we have completed the transition with the release of the all new MacBook," said Philip Schiller, Apple's senior vice president of worldwide product marketing.

The new laptops have much in common with the more expensive MacBook Pro models, such as the built-in webcam. The 13-inch widescreen MacBooks feature Intel Core Duo chips, with prices starting at £749 ($1,099). Apple says the new chips mean the laptops are four to five times faster than their predecessors.

The MacBooks come in time for the important back-to-school shopping season. Apple is hoping the Intel-based consumer laptops can tempt students away from Windows-based notebooks. In April, Apple released a program called Boot Camp that made it easy to install Windows XP on new Macs. The software opened a whole new world of compatibility to anybody with a Mac running one of the new Intel processors. Boot Camp is currently available as a free trial, and it is scheduled to form part of Leopard, the next version of Apple's OS X operating system.
APPENDIX 1

SUMMARY WRITING

Summary writing is an intellectual, creative process that involves comprehending a text, transforming its information analytically and synthetically, and producing a new document, a summary, which has its own specific linguistic and stylistic form. A summary is a text that conveys the essential information of the original in condensed form and is produced by reworking its content.

ANNOTATION WRITING

An annotation is an extremely condensed characterization of a source, consisting of information about the questions it addresses. An annotation covers the main topic, the problem and object of the work, its purpose, and its results. It indicates what the document contributes that is new in comparison with other works related in subject and purpose.

Types of annotations

Annotations differ depending on their purpose and on the type of document being annotated. In terms of length, annotations are divided into brief and extended (detailed) ones. A brief annotation usually characterizes a document in one particular respect: it clarifies the subject matter, explains or supplements the title, assesses the level of the material, and so on. An extended annotation often amounts to a list of the section headings of the primary document. It is written when the document is of considerable scholarly interest, and also when describing multi-faceted documents (textbooks, reference books, collections, etc.).

In terms of the method of analysis and evaluation, annotations can be divided into descriptive (reference) annotations and recommendatory annotations (including critical ones). A descriptive annotation gives a general idea of the document, whereas a recommendatory annotation characterizes its subject matter and content from a particular point of view. In information work, the descriptive annotation is the most widely used.

In terms of thematic coverage, annotations are divided into general and specialized ones.

General annotations characterize a document as a whole and are not aimed at any particular group of readers. Specialized annotations reflect only those parts and aspects of the document's content that are of interest to the users of a given information system (a given readership). In information practice, the specialized annotation is normally used; it is addressed to specialists in a particular field of research or practical work. This type of annotation is also useful when working with literature in the course of study, for example when students prepare summaries, reports, and other academic work.

Stages of annotation writing

An annotation is always preceded by the bibliographic data of the source (see the examples of annotations above). An annotation usually contains the following elements: 1) the subject heading; 2) the topic; 3) a condensed characterization of the material; 4) the publication data (the author and title of the article, the name and issue of the periodical in which the article appeared, and the place and date of publication).
APPENDIX 3
WORD-BUILDING AFFIXES
INFORMATION TECHNOLOGY
A collection of texts and exercises in English for third-year students of the specialty "Information Systems in Economics"
Compiled by Olga Viktorovna Ulyanova
Signed for printing ___________ 2008. Format 60x84/16. Offset paper. Flat printing. Conventional printed sheets ___. Publisher's sheets ___. Print run ___ copies. Order No. ________. Price not fixed. ЮТИ ТПУ. 652050, Юрга, ул. Московская, 17.