As software emerged from hardware, it generated the new academic discipline of computer science. Computer scientists are not engineers, because they design algorithms, not objects; but neither are they mathematicians, because their code runs on physical machines.
HCI began in the personal computing (PC) era. Adding people to the computing equation meant that getting the technology to work was only half the problem; the other half was getting people to use it. Web users who did not like a site just clicked on, and only web sites that got hits succeeded. Given equal functionality, users prefer the more usable product (Davis, 1989); e.g., Word replaced WordPerfect because users took a week to learn WordPerfect but picked up Word in a day. Just as computing had previously gained a software level, so it now gained a human level.
As computer science emerged from a combination of mathematics and engineering, so HCI is emerging from psychology and computer science. If psychology is the study of people and IT the study of software and hardware, then HCI is the study of psychology as it applies to IT.
It is the child of IT and psychology. It links CS to psychology as CS linked engineering to mathematics. This new meta-discipline applies psychological principles to computing design; e.g., Miller's paper on cognitive span (Miller, 1956) suggests limiting computer menu choices to about seven. The nature of people now defines the nature of computing.
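The menu guideline above can be sketched in code. This is an illustrative sketch only; the option names and the limit of seven are assumptions, not from the text:

```python
# Illustrative sketch: applying Miller's "about seven" cognitive span
# to menu design by splitting a long option list into submenus.

def chunk_menu(items, limit=7):
    """Split a flat menu into submenus of at most `limit` entries each."""
    return [items[i:i + limit] for i in range(0, len(items), limit)]

options = [f"Option {n}" for n in range(1, 18)]  # 17 hypothetical entries
for number, submenu in enumerate(chunk_menu(options), 1):
    print(f"Submenu {number}: {len(submenu)} items")  # no submenu exceeds 7
```

The same chunking idea applies to any interface list: the limit is a design parameter, not a law, but keeping groups small respects the user's cognitive span.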
Even as HCI develops into a traditional academic discipline, computing has already moved on to add sociology to its list of paramours. Socio-technical systems (STS) use the social sciences in their design as HCI interfaces use psychology. STS is not part of HCI, nor is sociology part of psychology, because a society is more than the people in it: East and West Germany, with similar people, performed differently as communities, as do North and South Korea today. A society is not just a set of people. People who gather to view an event or customers shopping for bargains are an aggregate, not a community.
They only become a community if they see themselves as one. Social systems can have a physical base or a technical base, so a socio-physical system is people socializing by physical means. Face-to-face friendships cross seamlessly to Facebook because the social level persists across physical and electronic architecture bases. Whether electronically or physically mediated, a social system is always people interacting with people.
Online communities work through people, who work through software that works through hardware. While sociology studies the social level alone, socio-technical design studies how social, human, information and hardware levels interact. A sociologist can no more design socio-technologies than a psychologist can design human-computer interfaces.
The complexity of modern computing arises from its discipline promiscuity (Figure 1). Before going on, we review the opposing theory of reductionism, which states that there is only one level, the physical, and so everything reduces to it.
How has this worked in science? The reductionist dream is based on logical positivism (footnote 12), the idea that only the physical exists, so all science must be expressed in physical terms. A message physically fixed in one way has, by this definition, zero information, because the other ways it could have been fixed do not exist physically. It is logically true that hieroglyphics that cannot be read contain, in themselves, no information at all.
If reader choices generate information, the data in a physical signal is unknown until it is deciphered. Data compression fits the same data into a physically smaller signal by encoding it more efficiently. It could not do this if information were fully defined by the physical message. The physical level is necessary for the information level, but it is not sufficient.
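The compression point can be seen directly in code. This is a minimal sketch using Python's standard zlib module; the message text is an arbitrary example:

```python
# The same information in a physically smaller signal: compression shows
# that the physical size of a message does not define its information.
import zlib

message = b"the cat sat on the mat " * 100   # a highly redundant signal
packed = zlib.compress(message)

print(len(message), "bytes uncompressed")
print(len(packed), "bytes compressed")       # far smaller physical signal
assert zlib.decompress(packed) == message    # yet no information was lost
```

The redundant message shrinks dramatically, yet decompressing recovers it exactly: the information survived a change of physical form, so it cannot be identical with that form.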
Conversely, information does not exist physically, as it cannot be touched or seen. So if the encoding is unknown, the information is undefined: the information a message conveys depends on the decoding process. One response to reductionism is mathematical realism, the view that mathematical laws are real even if they are not concrete (Penrose). Mathematics is a science because its constructs are logically correct, not because they are physical.
That an equation is later physically useful is not the cause of its reality. Reality is now a consensual construct, with physicality just one option. The acceptance of mathematical and cognitive constructs does not deny science, because science only requires that theory constructs be validated empirically.
For example, fear as a cognitive construct can be measured by heart rate, pupil dilation, blood pressure, a questionnaire, etc. Even physics cannot reduce its theories to pure physicality, as quantum theory implies a primordial non-physical (footnote 15) quantum level of reality below the physical (Whitworth). In physics, reductionism gave a clockwork universe where each state perfectly defined the next, as in a computer.
Quantum physics flatly denied this, as random quantum events by definition have no physical history. The quantum world cannot be reduced to physical events. Either quantum theory is wrong, or reductionism does not work. If all science were physical, all science would be physics, which it is not.
A reductionist philosophy that has failed in science in general is hardly a good base for a computing model.
If the physical level were sufficient alone, there would be no choices and so no information. Levels return the observer to science, as quantum theory's paradoxes demand. Currently, sociology sees individuals as conduits of meaning that reflect external social structures, and so treats psychological, biological, and physical views as faulty reductions of social realities.
In this social determinism, society writes social agendas, such as communism or capitalism, upon individual tabulae rasae (blank slates). Yet this just replaces the determinism of fields like biology (Wilson) and psychology (Skinner) with another form of determinism.
By contrast, in the general system model of computing (Figure 1), the social level emerges from the levels below it. So if all individual thoughts were erased, society would also cease to exist, as surely as if all its citizens had vanished physically. Sociology assumes psychology, which has led to attempts to re-attach it to its psychological roots. The top-down return of sociology to its source matches an equally vibrant bottom-up movement in computing, which has long seen itself as more than hardware and software (Boulding). The evolution of computing implies a requirements hierarchy (Figure 1).
If the hardware works, then software becomes the priority; if the software works, then user needs become important; and if user needs are fulfilled, then social requirements arise. As one level's issues are met, those of the next appear, just as climbing one hill reveals another. As hardware over-heating problems are solved, software data-locking problems arise. As software response times improve, user response times become the issue. Companies like Google and eBay still seek customer satisfaction, but customers in crowds also have community needs, like fairness.
In general, the highest system level defines a system's success; e.g., if no community forms, it does not matter how easy to use, fast or reliable the software is. Lower levels are necessary to avoid failure but not sufficient to define success. Each level adds its own requirements: a community level aims to reduce community overload and clashes and to increase productivity, synergy, fairness, freedom, privacy and transparency, while an information level aims to reduce information overload and clashes and to increase data processing, storage and transfer efficiency. Conversely, any level can cause failure: it does not matter how strong the community is if the hardware fails, the software crashes or the interface is unusable. An STS fails if its hardware fails, if its program crashes or if users cannot figure it out.
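The logic of the hierarchy can be sketched as code. This is an illustrative sketch; the level names and pass/fail checks are assumptions made for the example, not a model from the text:

```python
# Requirements hierarchy sketch: every level is necessary to avoid
# failure, but success is defined at the highest level present.

LEVELS = ["hardware", "software", "personal", "community"]  # low to high

def evaluate(status):
    """Return (performs, reason): one failed level sinks the whole system."""
    for level in LEVELS:
        if not status.get(level, False):
            return False, f"failure at the {level} level"
    return True, "success, defined at the community level"

print(evaluate({"hardware": True, "software": True,
                "personal": True, "community": False}))
print(evaluate(dict.fromkeys(LEVELS, True)))
```

The `all levels must pass` structure mirrors the argument: lower levels can only veto success, while the highest level is what success means.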
Hardware, software, personal and community failures are all computing errors (Table 1). The common feature is that the system fails to perform, and in evolution what does not perform does not survive. HCI systems based on meaning exchange fail from problems like misunderstanding or information overload. Socio-technical systems based on normative meme exchange fail from problems like mistrust, unfairness and injustice.
Computing as technology fails for technical reasons but, as socio-technology, also fails for social reasons. Technology is hard, but society is soft. That the soft should direct the hard seems counter-intuitive, but trees grow at their soft tips more than at their hard base. As a tree trunk does not direct its expanding canopy, so today's social computing advances were undreamt of by its engineering base.
This gives us a variety of design fields, as seen below. [Figure: A: Apple controls meet human requirements. B: TV controls meet engineering requirements.] Ergonomics designs safe and comfortable machines for people. Applying biological needs, such as avoiding posture and eye strain, to technology design merges biology and engineering.
Object design applies psychological needs to technology in the same way (Norman): e.g., an affordance is a physical object feature that cues its human use, as buttons cue pressing.
Physical systems designed with affordances based on human requirements perform better. In World War II, aircraft crashed until engineers designed cockpit controls with the cognitive needs of pilots in mind, e.g. providing continuous feedback. Human-computer interaction applies psychological requirements to screen design.
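The continuous-feedback principle can be illustrated with a small sketch; the copying task and its progress percentages are hypothetical, chosen only to show the idea:

```python
# A hypothetical computing example of continuous feedback: a long-running
# task reports its progress rather than leaving the user guessing.
import sys
import time

def copy_with_feedback(total_steps=10, delay=0.0):
    for step in range(1, total_steps + 1):
        time.sleep(delay)                        # stand-in for real work
        percent = 100 * step // total_steps
        sys.stdout.write(f"\rCopying... {percent}%")
        sys.stdout.flush()
    sys.stdout.write("\nDone.\n")

copy_with_feedback()
```

Without the progress line, the user cannot tell a slow operation from a hung one; the feedback keeps the system's state visible, as the cockpit designers learned.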
Usable interfaces respect cognitive principles. HCI turns psychological needs into IT designs as architecture turns buyer needs into house designs. Both devices pictured above are controls (footnote 20), but one is a cool tool and the other a mass of buttons. If one was designed to engineering requirements and the other to HCI requirements, which performs better? Fashion is the social requirement to look good applied to wearable object design.
In computing, a mobile phone can be a fashion accessory, just like a hat or handbag. Its role is to impress, not just to function. Aesthetic criteria apply when people buy mobile phones to be trendy or fashionable, so colour can be as important as battery life in mobile phone design.
Socio-technology is information technology meeting social requirements. Anyone online can see its power, but most academics see it as an aspect of their specialty, rather than a new multi-discipline in its own right.
Multi-disciplinary fields cannot, by their nature, be reduced to component discipline specialties. Higher levels direct lower ones to improve system performance. Levels cumulate, so the requirements of each level flow down to those below.
The same applies online, as online communities make demands of Netizens (footnote 21) as well as software. STS design is therefore about having it all: reliable devices, efficient code, intuitive interfaces and sustainable communities. Note that the social level is open-ended, as social groups form higher social groups. How social units combine into higher social units (footnote 22) with new requirements is discussed further in Chapter 5. So it is naive to think that friend systems like Facebook are the last step, that social computing will stop at a social unit size of two.
Beyond friends are tribes, cities, city-states, nations and meta-nations like the USA. Since we have a friend but belong to a community, the rules also change. With the world population at seven billion and growing, Facebook's hundreds of millions of active accounts are just the beginning.
The future is computer support not just for friends, but also for families, tribes, nations and even global humanity. For example, imagine a group browser, designed for many not just one, so that people can browse the Internet in groups, discussing as they go.
Instead of a physical tour bus there is an informational tour browser. Or members could take turns to host the next site, showing what they like. The possibilities of social computing are just beginning. At each stage, a new specialty joined computing, but pure engineers still see only mechanics, pure computer scientists only information, pure psychologists only human constructs, and pure sociologists only social structures. Yet the multi-discipline of computing as a whole is not pure, because purity is not the future.
It is more akin to a bazaar than a cathedral, as computer practitioners understand (Raymond, 1999). Like medieval fiefdoms, the pure disciplines hold hostage knowledge that by its nature should be free. The divide-and-conquer approach of reductionism does not allow computing to prosper as an academic multi-discipline. In practice, however, computing is thriving.

A key to software evolution is also the fact that such programs must increasingly be adapted to work on different types of emerging computer equipment and within various operating system architectures so that the program has broader appeal.
Meeting all of these needs is crucial to determining whether a software program remains viable, and, since software assets are such a crucial aspect of the information economy, software evolution has become a fundamental aspect of business adaptation and growth.
Meir Lehman, a computer scientist at Imperial College London, is credited with creating Lehman's Laws, which succinctly defined the process of software evolution and guided developers in the forward thinking of software visualization. Lehman's Laws are based upon the premise that software evolves as feedback on its performance increases and that its inevitable tendency is to become increasingly complex.
Lehman has stated that the nature of software evolution mirrors natural changes such as mutation in fruit flies, the way that cities expand over time, and how military structures incrementally improve upon weapons systems. The first three laws of the process emulate these trends in detailing Continuing Change, Increasing Complexity, and what is known as Large Program Evolution.
Continuing Change refers to the fact that the program must be adapted to meet current real-world business conditions, and this reflects Increasing Complexity, as the program must meet an ever-growing diversity of unexpected needs.

When connected to a color television set, the Apple II produced brilliant color graphics for the time. Millions of Apple IIs were sold over its long production run, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to schools, giving a new generation their first access to personal computers.
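Lehman's first two laws above lend themselves to a toy simulation. The numbers below are arbitrary assumptions for illustration, not Lehman's own model:

```python
# Toy model: each release must change the program (continuing change),
# and each change adds complexity unless deliberate effort removes some
# (increasing complexity).

def evolve(releases=10, growth_per_release=3, refactor_effort=0):
    complexity = 10                          # arbitrary starting complexity
    history = []
    for _ in range(releases):
        complexity += growth_per_release     # new features, fixes, ports
        complexity -= refactor_effort        # anti-regressive work
        history.append(complexity)
    return history

print(evolve())                   # complexity climbs release after release
print(evolve(refactor_effort=2))  # refactoring effort slows the climb
```

The point of the sketch is the trend, not the numbers: unless maintenance work is spent reducing complexity, each adaptive release leaves the system harder to change than before.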
The TRS-80 proved popular with schools, as well as for home use. The TRS-80 line of computers later included color, portable, and handheld versions before being discontinued. One of several personal computers released around the same time, the PET comes fully assembled with either 4 or 8 KB of memory, a built-in cassette tape drive, and a membrane keyboard.
The PET was popular with schools and for use as a home computer. After the success of the PET, Commodore remained a major player in the personal computer market for years afterward.
The success of the VAX family of computers transformed DEC into the second-largest computer company in the world, as VAX systems became the de facto standard computing system for industry, the sciences, engineering, and research. Shortly after delivery of the Atari VCS game console, Atari designs two microcomputers with game capabilities; one served primarily as a game console, while the other was more of a home computer.
Atari's 8-bit computers were influential in the arts, especially in the emerging demoscene culture of the 1980s and '90s. The Motorola microprocessor exhibited a processing speed far greater than its contemporaries. This high-performance processor found its place in powerful workstations intended for graphics-intensive programs common in engineering. Intended to be a less expensive alternative to the PET, the VIC was highly successful, becoming the first computer to sell more than a million units.
Commodore even used Star Trek television star William Shatner in advertisements. About 50,000 were sold in Britain, primarily to hobbyists, and initially there was a long waiting list for the system. The machine was expandable, with ports for cassette storage, serial interface and rudimentary networking. The DN is based on a Motorola microprocessor, a high-resolution display and built-in networking: the three basic features of all workstations. Apollo and its main competitor, Sun Microsystems, optimized their machines to run the computer-intensive graphics programs common in engineering and scientific applications.
Apollo was a leading innovator in the workstation field for more than a decade before being acquired by Hewlett-Packard. IBM's brand recognition, along with a massive marketing campaign, ignites the fast growth of the personal computer market with the announcement of its own personal computer (PC).
It featured a 5-inch display, 64 KB of memory, a modem, and two 5.25-inch floppy disk drives. Thousands of software titles were released over the lifespan of the C64, and by the time it was discontinued it had sold more than 22 million units. It is recognized by the Guinness Book of World Records as the greatest-selling single computer of all time.
Franklin was able to undercut Apple's pricing even while offering some features not available on the original. Sun Microsystems grows out of this prototype. Sun helped cement the model of a workstation having an Ethernet interface as well as high-resolution graphics and the UNIX operating system.
Lisa is the first commercial personal computer with a graphical user interface (GUI). It was thus an important milestone in computing, as Microsoft Windows and the Apple Macintosh would soon adopt the GUI as their user interface, making it the new paradigm for personal computing. The success of the Portable inspired many other early IBM-compatible computers. Compaq's success launched a market for IBM-compatible computers that came to hold a dominant share of the personal computer market.
The Macintosh was the first successful mouse-driven computer with a graphical user interface and was based on the Motorola microprocessor. The PC Jr. was marketed as a home computer but sold poorly, while a more powerful sibling machine included more memory and accommodated high-density 1.2 MB floppy disks. Within a few years, Dell became one of the leading computer retailers. It developed a very loyal following while add-on components allowed it to be upgraded easily. The inside of the Amiga case is engraved with the signatures of the Amiga designers, including Jay Miner, as well as the paw print of his dog Mitchy.
At 4 million operations per second and 4 kilobytes of memory, the chip gave PCs as much speed and power as older mainframes and minicomputers. The chip brought with it a wider architecture, a significant improvement over that of previous microprocessors. It had two operating modes, one that mirrored the segmented memory of older x86 chips, allowing full backward compatibility, and one that took full advantage of its more advanced technology.
It performed 2 million instructions per second, but other RISC-based computers worked significantly faster. Daniel Hillis of Thinking Machines Corporation moves artificial intelligence a step forward when he develops the controversial concept of massive parallelism in the Connection Machine. The machine used up to 65,536 one-bit processors and could complete several billion operations per second. Each processor had its own small memory linked with others through a flexible network that users altered by reprogramming rather than rewiring.
Using this system, the machine could work faster than any other at the time on a problem that could be parceled out among the many processors. One of Britain's leading computer companies, Acorn continued the Archimedes line, which grew to nearly twenty different models, into the 1990s.
The computer he created, an all-black cube, was an important innovation. This object-oriented multitasking operating system was groundbreaking in its ability to foster rapid development of software applications. VTech, founded in Hong Kong, had been a manufacturer of Pong-like games and educational toys when they introduced the Laser computer. The RISC microprocessor had an integer arithmetic and logic unit (the part of the CPU that performs operations such as addition and subtraction), a floating-point unit, and a clock rate of 33 MHz.
The chips remained similar in structure to their predecessors. What set the new chips apart was an optimized instruction set, with an on-chip unified instruction and data cache and an optional on-chip floating-point unit. Combined with an enhanced bus interface unit, the microprocessor doubled the performance of its predecessor without increasing the clock rate.
Apple had initially included a handle in their Macintosh computers to encourage users to take their Macs on the go, though not until five years after the initial introduction did Apple introduce a true portable computer. Although widely praised by the press for its active matrix display, removable trackball, and high performance, it saw sales weaker than projected. The line was discontinued less than two years later.
It would serve as the model for several other significant multi-processor systems that would be among the fastest in the world. Based on Charles Babbage's second design for a mechanical calculating engine, a team at the Science Museum in London sets out to prove that the design would have worked as planned.
Apple's Macintosh Portable meets with little success in the marketplace and leads to a complete redesign of Apple's line of portable computers. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. One PowerBook was the entry-level machine, a second was more powerful and had a larger memory, and the high-end model featured an active matrix display, a faster processor, and a floating-point unit. The PowerBook line of computers was later discontinued.

Based on the Touchstone Delta computer Intel had built at Caltech, the Paragon is a parallel supercomputer that used over two thousand (later increased to more than four thousand) Intel i860 processors.
More than one hundred Paragons were installed over the lifetime of the system, each costing as much as five million dollars. The Paragon at Caltech was named the fastest supercomputer in the world at the time. Paragon systems were used in many scientific areas, including atmospheric and oceanic flow studies, and energy research.
Apple enters the handheld computer market with the Newton. The handwriting recognition software was much maligned for inaccuracy. The Newton line never performed as well as hoped and was eventually discontinued. The Pentium introduced several advances that made programs run faster, such as the ability to execute several instructions at the same time and support for graphics and music.
Using dual PowerPC CPUs, and featuring a large variety of peripheral ports, the first devices were used for software development. While it did not sell well, the operating system, BeOS, retained a loyal following even after Be stopped producing hardware, after fewer than 2,000 machines had been produced.
Officially known as the TrackWrite, the automatically expanding full-sized keyboard used by the ThinkPad is designed by inventor John Karidis. The keyboard was composed of three roughly triangular interlocking pieces, which formed a full-sized keyboard when the laptop was opened, resulting in a keyboard significantly wider than the case.
Sony had manufactured and sold computers in Japan, but the VAIO signals their entry into the global computer market.
The first VAIO, a desktop computer, featured an additional 3D interface on top of the Windows 95 operating system as a way of attracting new users. The VAIO line of computers would be best known for laptops designed with communications and audio-video capabilities at the forefront, including innovative designs that incorporated TV and radio tuners, web cameras, and handwriting recognition.
The line was later discontinued. For several years, it was the world's fastest supercomputer, achieving a record peak performance for its day. The machine was noted for its ease of use and included a 'manual' that contained only a few pictures and fewer than 20 words. The camera had a maximum resolution of only a fraction of a megapixel. The J-Phone line would quickly expand, releasing a flip-phone version just a month later.
Cameras would become a significant part of most phones within a year, and several countries have even passed laws regulating their use. A consortium of aerospace, energy, and marine science agencies undertook the project, and the system was built by NEC around their SX-6 architecture. To protect it from earthquakes, the building housing it was built using a seismic isolation system that used rubber supports.
The Earth Simulator was for a time listed as the fastest supercomputer in the world. After retiring their initial Visor series of PDAs, Handspring, founded by former Palm Inc. executives, introduced the Treo line of smartphones, designed with built-in keyboards, cameras, and the Palm operating system.
The Treo sold well, and the line continued until Handspring was purchased by Palm. With a distinctive anodized aluminum case, and hailed as the first true 64-bit personal computer, the Apple G5 is the most powerful Macintosh ever released to that point. While larger than the previous G4 towers, the G5 had comparatively limited space for expansion.
Harkening back to the hobbyist era of personal computing in the s, Arduino begins as a project of the Interaction Design Institute, Ivrea, Italy. Each credit card-sized Arduino board consisted of an inexpensive microcontroller and signal connectors which made Arduinos ideal for use in any application connecting to or monitoring the outside world.
Nearly a quarter century after IBM launched their PC, they had become merely another player in a crowded marketplace. Lenovo became the largest manufacturer of PCs in the world with the acquisition, later also acquiring IBM's server line of computers. Named in honor of the space shuttle which broke up on re-entry, the Columbia supercomputer is an important part of NASA's return to manned spaceflight after the disaster.
Columbia was used in space vehicle analysis, including studying the Columbia disaster, but also in astrophysics, weather and ocean modeling.
At its introduction, it was listed as the second-fastest supercomputer in the world, and this single system increased NASA's supercomputing capacity many times over. The first offering to the public required the buyer to purchase one machine to be given to a child in the developing world as a condition of acquiring one for themselves. Within a few years, over two million had been distributed. Many companies have attempted to release electronic reading systems dating back to the early 1990s.
Online retailer Amazon released the Kindle, one of the first to gain a large following among consumers. The first Kindle featured wireless access to content via Amazon.