
In Depth: How the space race changed computing
Getting into space requires more than just rockets. With both lives and huge amounts of money on the line, NASA has had to advance not only computer hardware but also the techniques and technologies required for working on the cutting edge.
We take a look at the systems that have helped push mankind from the dawn of the space race to the edge of the unknown universe.
The Gemini spacecraft's control system was to be used as a test-bed for the Moon landings, so it had to do more than merely crunch numbers: it had to be error-proof, efficient and, above all else, small.
NASA gave IBM $26.6 million to build a computer capable of running the necessary programs, and the specifications were brutal. IBM had to create a system capable of proving the feasibility of docking in space, all packed into a box no bigger than 19in high, 15in wide and 13in deep, and weighing less than 60 pounds.
While this may only be the same size and weight as a modern tower PC, comparable computers at the time took up whole rooms and required a staff of operators to keep them going.
The calculations required were no less advanced. The curved nature of orbit makes catching up to another craft and docking with it a confusing and dangerous procedure. For example, firing a rocket motor to get closer to another spacecraft actually puts your own ship into a higher orbit. Paradoxically, you have to aim lower than the other craft by a certain degree to catch it up, and that's difficult to do by hand and with limited fuel.
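To get a feel for why aiming lower works, compare the periods of two circular orbits: the lower one is completed faster, so the chasing craft gradually gains on its target. Here's a back-of-the-envelope sketch in Python (the altitudes are illustrative, not actual Gemini figures):

import math

MU_EARTH = 3.986e14     # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000     # mean Earth radius, m

def orbital_period(altitude_m):
    # Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3/mu)
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH)

chaser = orbital_period(270_000)   # chasing craft in a lower orbit
target = orbital_period(300_000)   # target craft in a higher orbit

print(f"Chaser period: {chaser / 60:.1f} min")
print(f"Target period: {target / 60:.1f} min")
print(f"Chaser gains about {target - chaser:.0f} seconds per revolution")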
Gemini's onboard computer had to help fly the craft during six distinct mission phases: pre-launch (where it monitored the health of both itself and other onboard systems), blast off, achieving a stable orbit, catching a drone (dubbed Agena), docking with it and finally negotiating a safe re-entry.
To create the Gemini computer system, IBM engineers soldered together hundreds of individual transistors, resistors and capacitors. The CPU was so simple that it only understood 16 different instructions, although the finished machine could execute 7,000 of those instructions per second.
Interestingly, when writing the onboard software, IBM engineers pioneered the software engineering techniques that all avionics developers use today. They created modules that could be verified mathematically as correct and that each performed just one simple task.
The care with which the computer was programmed didn't end there. Errors while loading the finished program from magnetic tape into the computer's 4kB of non-volatile core memory were an ever-present danger. Imperfections in the tape could create flipped bits and therefore bad instructions in the transferred program.
At the time, the generally accepted standard was one error in 100,000 bits. NASA demanded an accuracy of one error in 1,000,000,000. To achieve this, the program was recorded three times and special hardware checked that all three copies of each bit were identical before adding it to memory.
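In modern terms the principle looks something like this – a simplified sketch of the idea, since the real comparison was done in dedicated hardware as the tape was read:

def load_word(copy_a, copy_b, copy_c):
    """Accept a program word into memory only if all three tape copies agree."""
    if copy_a == copy_b == copy_c:
        return copy_a
    # Any disagreement means a possible flipped bit, so refuse the word
    # rather than risk loading a bad instruction.
    raise ValueError("tape copies disagree - reload required")

# Three identical recordings of the same (made-up) instruction word
word = load_word(0b1010001100001111, 0b1010001100001111, 0b1010001100001111)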
Landing on the Moon
Getting to the Moon takes more than pointing a rocket towards the white thing in the sky – you need to know exactly where it's going to be in three days' time. As a result, entering lunar orbit, landing on the Moon, ascending again and then getting home all require computers.
Re-entry involves getting the angle exactly right so as not to fry the crew or have the Command Module skip off the atmosphere and back into space like a stone on a pond. NASA knew that the Apollo missions would require onboard systems capable of handling every aspect of the mission – and that meant a guidance system of unprecedented ingenuity.
It chose MIT to design the system. The institute had previously created the guidance system for the Polaris nuclear missile, so it was a natural choice to design, build and write the software for the iconic Apollo Guidance Computer (AGC) that would get men to the Moon and back.
To save weight, space and power, NASA and MIT decided to risk using newly invented integrated circuits. Each chip contained a circuit called a NOR gate, which produces an output only when none of its inputs receive a signal. Combined, these gates can form the other logic circuits needed to build a complete CPU.
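As a quick illustration of how one gate type can stand in for all the others, here's a sketch in Python – the logic only, nothing to do with the AGC's actual wiring:

def NOR(a, b):
    # Output is 1 only when neither input is 1
    return int(not (a or b))

def NOT(a):
    return NOR(a, a)

def OR(a, b):
    return NOT(NOR(a, b))

def AND(a, b):
    return NOR(NOT(a), NOT(b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "-> AND:", AND(a, b), " OR:", OR(a, b))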
It took 5,600 NOR gates to build the computer NASA would eventually send to the Moon. These were arranged into trays, each containing 24 modules. Each module contained 60 sub-modules, and each sub-module contained several NOR chips. The whole thing consumed 2.5 amps, and ran at 1MHz.

OLD TECH: This is a good example of the PCs sent into space back in the day
By modern-day standards that specification may look almost embarrassingly underpowered, but make no mistake – it was considerably more powerful than the often-quoted 'pocket calculator' comparison implies.
Each Apollo mission used two AGCs, one in the orbiting command module and another in the lunar module. Both were identical in hardware, but had different software woven into their memories.
The astronauts communicated with the units using a display and keyboard setup (called a 'DSKY' for short). There were two DSKYs in the command module and one in the lunar module.
Perhaps most remarkably, the AGC contained just 36kB of memory holding the mission-critical programs, composed of ferrite cores woven by hand into a structure resembling macramé.
Software for any given mission had to be delivered months in advance so that it could be put in place, and mistakes were nearly impossible to correct.
Running tasks in space
Because of its hardware limitations, the AGC pioneered the use of the kind of priority-driven kernel we see in desktop operating systems today. Its programs were always one of two types: short tasks designed to take no more than 4ms, and jobs that were designed to run for longer.
Most importantly, the system watched out for failure. If a program on your desktop crashes, you can reboot the system hardware to restore normal operations. In space, that's not an option.
The AGC's operating system required a special monitor program designed to restart individual tasks whenever they took too long to respond (indicating that they'd failed to complete). Such a situation actually occurred during the descent of Apollo 11's Eagle Lunar Module.
With the world holding its breath, Eagle's rendezvous radar started swamping the program assigned to monitor it. Handling this data caused the program to fail with the now famous '1202 alarm' – a warning that the system was overloaded. Luckily for Neil Armstrong and Buzz Aldrin, Mission Control trusted that the reset program would do its job flawlessly.
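In outline, the behaviour that saved the landing looks something like the sketch below. The structure and names are hypothetical – this isn't the AGC's actual Executive code – but it shows the idea of running the highest-priority work first and restarting anything that overruns or fails to finish:

import time

class Job:
    def __init__(self, name, priority, deadline_s, run):
        self.name = name
        self.priority = priority          # higher number = more important
        self.deadline_s = deadline_s      # time budget for one pass
        self.run = run                    # callable returning True when done

def executive_cycle(jobs):
    """One pass of a simple priority executive with a restart monitor."""
    for job in sorted(jobs, key=lambda j: j.priority, reverse=True):
        started = time.monotonic()
        finished = job.run()
        overran = (time.monotonic() - started) > job.deadline_s
        if not finished or overran:
            # Abandon the misbehaving job and schedule a fresh copy
            # instead of letting it hang the whole computer.
            print(f"Restarting {job.name}")

# Hypothetical mission jobs: the radar monitor is being swamped and never finishes
jobs = [
    Job("guidance",      priority=3, deadline_s=0.004, run=lambda: True),
    Job("radar_monitor", priority=2, deadline_s=0.004, run=lambda: False),
    Job("display",       priority=1, deadline_s=0.004, run=lambda: True),
]
executive_cycle(jobs)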
The need for software without bugs led to the rapid development of advanced software verification techniques. NASA also acquired considerable experience in managing large, real-time software projects that would lead directly to the development of the fly-by-wire systems used first in fighter aircraft and now in commercial airliners.
But even before the world watched Armstrong and Aldrin happily bounce about on the lunar surface for the first time, NASA was already drawing up plans for an advanced reusable spacecraft that would be entirely fly-by-wire.
Flying the Space Shuttle
Partly because of its association with NASA, by the late 1960s IBM had a wealth of experience with flight computers. So rather than commission another bespoke system for the Space Shuttle, NASA chose to simply buy the IBM AP-101 avionics computer, as used in the B-52 nuclear bomber and the F-15 fighter.
Due to the complexity of the Shuttle, it uses five of these systems, re-christened the General Purpose Computer (GPC). All of the Shuttle's subsystems (radar, life support and so on) are interconnected by more than 300 miles of wiring and share no fewer than 24 data buses running throughout the ship; the GPCs communicate over eight of those buses.
On early flights, the crew also carried a Hewlett-Packard HP-41C calculator programmed to determine ground-station availability, and to tell them when to fire the re-entry retro rockets should it be necessary to attempt an emergency manual de-orbit.
For the safety of the crew and the expensive payloads it carries, four of the GPCs replicate each other's functions. If one obtains a result that the others don't, it's presumed to be wrong.
The fifth computer is programmed by a different team and acts as a backup if a second opinion is required. In addition, a sixth GPC is on board during each flight and can be swapped with a malfunctioning unit if necessary.
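The underlying voting idea is simple enough to sketch – this is an illustration of the principle only, not the Shuttle's actual redundancy-management software:

from collections import Counter

def vote(results):
    """Return the majority result and the indices of any computers that disagree."""
    majority, _ = Counter(results).most_common(1)[0]
    suspects = [i for i, r in enumerate(results) if r != majority]
    return majority, suspects

# Four primary GPCs computing the same quantity; unit 2 has gone astray
value, suspects = vote([41.7, 41.7, 39.9, 41.7])
print(f"Accepted value: {value}, suspect units: {suspects}")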
When the Shuttle first flew in 1981, the GPCs each contained just 104kB of core memory. It wasn't until 1984 that NASA approved an upgrade to a faster processor and 128kB (expandable to 256kB) of silicon RAM. Due to technical problems, the upgraded computers didn't fly until 1991.
Trouble at the station
The International Space Station (ISS) is now seen as man's most successful permanent step off the planet: the pinnacle of space flight so far. But its off-the-shelf computers aren't exactly cutting-edge.
Other than the embedded systems that control the station itself, the crew use ordinary IBM ThinkPad laptops for general computing duties and experiments. Now made by Chinese manufacturer Lenovo, they're chosen for their history of solid construction.
But things haven't always gone according to plan for the computers controlling the ISS. Once the sophistication of computers gets to a critical point, it seems inevitable that failures will occur. Onboard navigation computers regulate the station's position and angle using an array of gyroscopes and thrusters. They also control oxygen generation.
In 2007, these computers crashed due to a power surge while a visiting Shuttle crew deployed two new solar panels. Ground-based flight controllers had to use the Shuttle's thrusters to keep the space station at the right angle to the sun while Russian engineers raced to fix the problem.

MIR: Retired in 2001, Mir was the longest running space station and had people continuously on board for 10 years
However, after the navigation systems were rebooted, an alarm sounded and flight controllers asked Shuttle commander Rick Sturckow to re-enable Atlantis' autopilot to again keep the ISS in position. Eventually, the problem was traced to a blown circuit, and the station's navigation capability was restored.
Not all problems are so technical, however. One downside of off-the-shelf systems is the greater chance of human error creeping in, as when one crew member accidentally introduced the W32.Gammima.AG keylogger virus to the station on an infected laptop. It spread to several other laptops.
Described by NASA as a "nuisance", it's lucky the laptops in question were used only for non-critical tasks, including composing email and storing the results of nutritional experiments.
The unmanned future
NASA's new challenge is using computers to go where humans can't. Ground-based controllers still give the orders, but it's the machines' MAPGEN software that decides how best to carry them out. NASA says that MAPGEN is the first artificial intelligence software to have command of a rover on another planet.
As ever, the hardware is underpowered, running on a stripped-down PC with just 128MB of RAM. The current Mars rovers work independently, but multiple vehicles could do far more: by relaying information along a chain of rovers, the failure of any single rover wouldn't mean the end of the mission.
Some experiments, such as measuring seismic activity, will require multiple rovers. The problem is how to control them. Luckily, a group at NASA's Jet Propulsion Laboratory in Pasadena is working on the problem. One possible solution is MISUS (Multi-rover Integrated Science Understanding System), which is designed to enable collections of rovers to work together in intelligent ways.

SMART BOT: Networked teams of these little guys could one day be exploring where we can't go
MISUS will have a carefully created initial plan to work through, but changes in conditions and the results of onboard experiments will enable it to constantly review this plan and make modifications without asking Earth for help – which is vital given the long delays inherent in communicating across such distances.
Soon after the first brave men and women were shot into orbit, space flight became impossible without computers. However, humans require massive support to slip their earthly bonds for even just a few hours.
Computers have no such restriction. With developments in AI gathering speed, perhaps rather than humans colonising the solar system in the years to come, it'll be robots that don't mind taking a one-way trip to far-flung planets. At least, we hope they won't…
Review: Medion Akoya E3211
The Medion Akoya E3211 is a CULV laptop, or Consumer Ultra Low Voltage for the uninitiated.
While it can't compete performance-wise with many of the other laptops around this price, it is by far the most portable and will appeal to anyone who needs to work when regularly out and about.
CULV laptops are targeted at people who frequently travel and need their laptops with them, so a long battery life and a thin-and-light design are crucial. To achieve both of these requirements, an Intel CULV processor has been used in this laptop, alongside 4096MB of memory.
The processor places only a light load on the laptop's battery, giving you 333 minutes of power to work with between charges. It also runs cooler than more powerful chips and so doesn't require a large cooling system, meaning the laptop's form factor can be kept down and the 1.8kg chassis will be no problem to carry around.
Performance can't compare with the rest of the laptops at this price, and the laptop is limited to running office applications, such as word processors, spreadsheets and email clients.
An integrated GPU also means that multimedia functionality is very limited, and those after a laptop for editing video or photos should instead check out the Samsung Q320.
The 320GB hard drive will provide plenty of room for all your files and data, and the integrated DVD rewriter lets you read data from and burn files to CDs and DVDs. The 13.3-inch screen features a standard 1366 x 768-pixel resolution and is very sharp and bright.
Colour reproduction is decent, but not as good as the PC Nextday. A shiny Super-TFT screen coating is in place, but it suppresses reflections very well, meaning there's not too much of an issue in bright or rapidly changing light.
Decent build quality
Build quality is good, and the laptop is put together well, aiding durability on the road. The very shiny lid quickly gathers dirt and grime, however, and requires regular cleaning to keep it smart and tidy.
The keyboard is comfortable to use and you'll have no problems striking up a fluid action, the only issue being that the keys are very flat and some might find it easy to get lost on the board when speed-typing.
802.11n Wi-Fi is joined by Gigabit Ethernet, making it easy to get online at high-speed wirelessly and also when using wired networks. There are also three USB ports for your peripherals and a VGA-out for connecting to an external analogue monitor when in the home or office.
While the Medion Akoya E3211 can't compete power-wise, it is by far the most portable and ideally suited to those who need a portable partner to accompany them on the road.
Opinion: Lack of competition is stifling IT innovation
Windows 7 is 234 per cent more popular than its predecessor. It's official. OK, so that figure relates to the first few days of sales in the US, and the predecessor in question is Windows Vista, the Antichrist OS.
Even so, pathologically mediocre as it may well be, Windows 7 was well received. What interests me is how this reflects a broader malaise that continues to blight the PC industry.
What else but Microsoft's ongoing near-monopoly can explain the continued success of an operating system that sports a near-total absence of real innovation?
The broader problem is that the key components inside your PC, both software and hardware, are still controlled by far too few companies. In just about any other industry of global import, the way Microsoft dominates the software landscape while Intel has the hardware platform largely sewn up and Google owns web search would be viewed as unhealthy.
A handy analogue is the food industry in the US. If you've seen the documentary Food, Inc., you'll know what I'm talking about. According to the film's makers, key sectors in the US food industry have been whittled down from around 20 major players in the 1970s to just four mega-producers today.
Unsavoury practices
The result has been the emergence of a range of seriously unsavoury practices – the concentration of power in the hands of a handful of massive companies hasn't done anyone any good. Except those companies, of course.
Compare that to the PC industry and, if anything, the concentration of power looks much, much worse. Both Microsoft and Intel, for example, have recently been subject to prosecutions for market abuses. But a plausible argument can still be made that their dominance has benefited the PC industry and end users.
Together, Intel and Microsoft provided developers with a single, unified platform and a massive customer base. Thus was born the astonishing ecosystem of PC-compatible applications and devices we take for granted today.
Moreover, I suppose we should all be grateful for what little competition there has been. Without AMD and ATI to keep Intel and Nvidia honest, for instance, we might now be marvelling at the power of single-core Intel Pentium 5 processors and Nvidia GeForce 4900 TI graphics.
Similarly, I scarcely dare imagine what horrors the Beast of Redmond would have sired were it not for the threat, however remote, of Apple's OS X and the open-source Linux operating system.
So, a lot of power and wealth may have been accumulated in the hands of a few thanks to the Wintel monopoly, but mankind has benefited enormously from the emergence of ubiquitous personal computing.
A democratic wave
Still, if I'm convinced it's all been worth it up to now, I'm equally sure the time has come for a more democratic wave of innovation. Fortunately, there are signs it's already happening.
Microsoft is increasingly under siege from all conceivable angles, whether it's the success of Linux as an enterprise OS or the arguably even more lethal threat posed by the humble web browser. Who needs a complex operating system if all your applications are hosted online?
Intel's hardware nut seems trickier to crack. Creating computer chips is a complex business – the idea of new entrants to the market is virtually inconceivable. However, the increasing importance of mobile devices might be the key.
Currently, ultra-mobile computing is dominated not by Intel chips but by ARM's processor architectures. Crucially, ARM's approach to producing CPUs is rather novel.
In fact, ARM doesn't really produce processors at all. Rather, it licenses out designs. This gives chip makers the option of simply knocking out an off-the-shelf design or fusing an ARM processor architecture with their own technology to create something unique.
As the remit for ultra-mobile devices expands over the next few years, so will the range and ability of ARM-based processors. Chips with all kinds of enhanced functions, from video decoding to cryptography acceleration, are likely to appear.
Intel recognises the threat posed by a plethora of purpose-built ARM processors and so has taken the bold step of licensing out the Atom processor architecture to TSMC, one of its main rivals in the chip production business. Again, the idea is to allow the Atom core to be combined with a range of third-party circuitry.
All of which means we're poised for a battle royal between ARM and Intel in the ultra-mobile segment.
Google, meanwhile, might just provide a similar foil for Microsoft. The result would be a perfect storm of hardware and software innovation. If that happens, the mediocrity of Windows 7 will be but a distant memory.
Review: Dell Inspiron 1750
The Dell Inspiron 1750 is a large, comfortable laptop that offers plenty of performance and, considering its size, portability.
The keyboard isn't very competitive, however, and we have some issues with its poor build quality.
The 17.3-inch screen is great. The 1600 x 900-pixel resolution is extremely sharp and images are satisfyingly detailed, making photos and films look great. You can comfortably work with multiple windows open, but the shiny Super-TFT screen coating frequently attracts distracting reflections.
Poor durability
Build quality is disappointing, with the chassis panels flexing under minor pressure. This reduces durability and we can't see this laptop surviving much of a drop.
This has a direct effect on usability, and the keyboard is one of the spongier options. The keys are nicely spread out, but the action is quite clunky and it's hard to strike up a good rhythm. The dedicated pad of numerical keys will suit those who regularly use spreadsheets and input data, however.
A dual-core Intel Pentium processor is in place, alongside 4096MB of DDR2 memory. This makes the machine suitably powerful during everyday use, and we had no issues running multiple office applications concurrently without any noticeable drop in performance.
More resource-intensive programs, such as digital photo editing suites, will also run smoothly, but the integrated Intel GPU strongly hinders more demanding multimedia use, with only the most basic gaming or video editing possible.
The 320GB hard drive isn't the most capacious, but provides plenty of space for storing thousands of files, photos and movies. The DVD rewriter lets you write and read DVDs without issue.
Considering this laptop's size, portability is decent and the 250-minute battery life impressed us. At 3.1kg, the machine isn't light, but you could certainly carry it around on short trips without issue.
802.11n Wi-Fi provides the fastest wireless technology, while 10/100 Ethernet allows you to set up and connect to wired networks with average speed at home or in the office.
The inclusion of three USB ports isn't particularly generous on a chassis of this size and there's no high-definition HDMI port, but an older VGA port is in place and makes it easy to connect to external analogue monitors.
Despite its durability and keyboard issues, the Inspiron 1750 is a comfortable desktop replacement that provides decent performance at a competitive price. Those after a little more should check out the Acer Aspire 7736G-663G25Mn, however.