Salora Manager

 

Salora Manager: a computer in borrowed plumes

In the early 1980s, home computers flooded the world at an accelerating pace. One interesting step of the early computer era was the Salora Manager: a home computer that was not technically Finnish but was sold under a Finnish brand name. The Salora Manager was the Finnish version of the VTech Laser 2001 computer and offered consumers an affordable gateway to the world of information technology shortly before the Commodore 64 and the MSX standard became widespread. The idea was quite clever. Many households had Salora radios and televisions, which had a good reputation and an existing distribution network in the Nordic countries. So why not offer consumers Salora home computers as well, the Salora marketing department must have reasoned. The project was launched, and the devices were named Salora Fellow and Salora Manager.

The Salora Manager used a BASIC interpreter developed by Microsoft. Programs were loaded from compact cassettes, but a 5.25-inch floppy disk drive was also available, as were a few cartridge games, such as Auto Chase. The cassette drive was not built in but connected separately – often a standard home stereo cassette player. The machine's user interface booted straight into the BASIC development environment, where users could write programs and draw graphics, for example. A small selection of games and programs was released for the Salora Manager. Since the device was based on the VTech Laser 2001, all of that machine's programs worked on the Salora Manager as well – either loaded from cassette or typed in manually through the BASIC interpreter.

The Salora Manager was clearly targeted at beginners and families who wanted an affordable way to get acquainted with computers. It was a teaching tool and a platform for programming practice rather than a gaming machine. It failed to establish itself as the home computer market developed and quickly consolidated around a few options: in 1984–1985 it was overtaken by technically superior machines with a wider range of games, such as the Commodore 64 and the MSX devices.

The Salora Manager had a short life cycle, but it fulfilled its purpose of introducing Finns to information technology. Today it is part of Finland's information technology history, valued in particular as a domestic brand and a symbol of the spirit of the times. In retro collections the machine is an interesting curiosity – a reminder of a time when information technology was still new, exciting, and somewhat experimental. Although the Salora Manager was manufactured by the global VTech company, in Finland it gained an identity of its own, bringing affordable home computers to Finnish homes at a time when computers were not yet commonplace. The Salora Manager is an important part of the traveling I love 8-bit® computer exhibition organized by the Kallio Computer Museum, where visitors can try out the device.

A few years after the Salora Manager, Nokia began its global conquest with its own mobile phone products. Despite this modest background, Finns proved able to develop world-class consumer products in information technology only a few years after the ill-fated Salora computers.

 

 

Memotech MTX-512

The home computer from Memotech:
It’s not MSX. It is MTX!

In the early 1980s, Britain became one of the most vibrant and competitive home-computer markets in the world. From the modest ZX81 to the BBC Micro, personal computing in the United Kingdom was experiencing a golden age. Every few months, a new micro appeared on the shelves, promising to bring the future into the living room. Yet by 1984, that market was also beginning to strain under its own weight. Dozens of small companies entered the race, and just as quickly vanished when the public’s enthusiasm cooled or when giants such as Sinclair, Commodore, and Amstrad tightened their grip. Into this volatile world came a company called Memotech Ltd., and with it, one of the most stylish and technically refined British home computers ever built: the Memotech MTX.

Memotech was not originally a computer manufacturer. Founded by Geoff Boyd and Robert Branton in Oxfordshire, the company gained early success producing high-quality RAM expansions for the Sinclair ZX81, a machine famous for its affordability but equally notorious for its limitations. Memotech’s metal-cased expansions were praised for their reliability and design, and soon the firm decided that it could build an entire computer to the same standard. What emerged was the MTX series—machines that would stand out for their professional appearance, strong build quality, and advanced specification for the time.

The first model, the MTX 500, appeared in mid-1983, followed soon by the MTX 512 and later the RS128. All shared the same elegant black brushed-aluminum case, a far cry from the plastic shells of most of their rivals. Inside, the machines ran on a Zilog Z80A processor clocked at 4 MHz, with either 32 KB or 64 KB of RAM. Their Texas Instruments video chip could display 256 × 192 graphics, sixteen colours, and up to thirty-two hardware sprites—capabilities that compared favourably to the contemporaneous ZX Spectrum and even challenged the newer MSX machines. Sound came from the SN76489A chip, offering three tones and noise. The keyboard was full-sized, with eighty keys and a solid mechanical action that felt almost luxurious. In an era when rubber chiclets and wobbly keys were common, the MTX looked and felt like a serious instrument.

Technically, the MTX was versatile. Its expansion ports allowed the attachment of disk drives, serial and parallel interfaces, and even the running of the CP/M operating system through an external module called the FDX. That meant business software such as WordStar or dBase could, in principle, run on a home machine. The built-in BASIC interpreter, stored in ROM, was powerful and included commands for graphics and sound that made the machine friendly to programmers. Memotech clearly wanted to straddle both the hobbyist and professional markets—appealing to the home user who enjoyed games and coding, but also to schools and small businesses seeking a capable yet affordable CP/M system.

Yet the MTX entered the market at a perilous time. The British micro boom was beginning to falter. The public that had enthusiastically bought computers in 1982 and 1983 was now more cautious, and retailers were flooded with unsold stock. Commodore had driven prices down with the VIC-20 and later the dominant Commodore 64, while Amstrad was preparing to launch its integrated CPC 464 at an aggressive price. The MTX, with its premium metal case and full-sized keyboard, inevitably cost more—around £275 at launch for the 32 KB model. For many families, that was a difficult proposition when cheaper and better-supported alternatives existed.

Software support proved to be the decisive weakness. Despite its technical strengths, the MTX arrived without a strong base of games or educational titles, and developers were reluctant to commit to another new platform. Memotech hoped that compatibility with the emerging MSX standard might help, but the MTX ultimately differed just enough to make direct software sharing impossible. Without the ability to run Spectrum or Commodore programs, and lacking an MSX badge, it occupied an awkward no-man’s-land between standards. A few good titles appeared—Attack of the Mutant Camels, Kilopede, and Flight Simulator among them—but they were not enough to establish a thriving ecosystem.

Still, for the small number of users who did buy one, the MTX was a delight. Programmers appreciated the machine’s fast BASIC and the ability to write in assembler using the built-in monitor. The graphics and sound chips offered creative potential, and the solid keyboard made it a pleasure to type on. The optional FDX system, with its twin floppy drives and CP/M compatibility, turned the MTX into a credible small-business computer. In educational settings, it offered durability and expandability. There was a sense among enthusiasts that the MTX was a machine for those who cared about quality rather than fashion—a connoisseur’s choice.

Unfortunately, quality alone could not save it. Memotech invested heavily in production facilities, expecting large sales volumes that never materialized. The company also pursued ambitious export deals, including a proposed contract to supply computers to Soviet schools, but the political and logistical complexities of the Cold War scuttled the plan. By 1985, unsold stock piled up, and the firm was forced to slash prices drastically: the MTX 500 fell to under £80 in some clearance sales. Not long afterward, Memotech went into receivership. Production of the MTX line ceased, and the remaining inventory gradually disappeared from the market.

In hindsight, the Memotech MTX’s failure was not due to poor engineering but to timing and market realities. By 1984, consumers were increasingly driven by price and software libraries rather than by hardware elegance. The ZX Spectrum dominated the home-gaming market through sheer volume and developer support. The BBC Micro had captured the education sector. Commodore and Amstrad were fighting over the mainstream, leaving little room for a stylish outsider. Even in business computing, the CP/M niche was rapidly being replaced by IBM-compatible PCs. The MTX was a machine caught between worlds: too refined and expensive for the casual user, too small and incompatible for the business world.

Yet for all its commercial disappointment, the MTX left a mark on computing culture. Collectors today still admire its craftsmanship, the smoothness of its keyboard, and the understated beauty of its aluminum shell. In many ways, it symbolized what was best about the British microcomputer era: a spirit of engineering ambition, a willingness to innovate, and a belief that personal computing could be elegant as well as accessible. Though only a few tens of thousands were ever sold, the MTX remains a favourite among retro-computer enthusiasts who see in it the road not taken—the idea that a British machine could compete on quality, not just price.

The story of Memotech and its MTX computers is, in the end, both inspiring and tragic. It demonstrates how talent and vision can produce remarkable technology, yet also how unforgiving the marketplace can be. The MTX stood proudly among the crowded ranks of 1984’s home computers, its metal gleaming where others offered brittle plastic, but when the dust settled, it was the mass-market machines that survived. Memotech disappeared by 1985, leaving behind only the memory of a beautifully built machine that arrived just a little too late and cost just a little too much.

Today, when one powers up a surviving MTX and sees its clean blue screen flicker to life, it is easy to imagine what might have been. In a time when computing was still an adventure, the Memotech MTX represented both the dream of perfection and the reality of the marketplace. It is a reminder that technology’s history is written not only by the winners, but also by the elegant, doomed machines that dared to compete.

Atari XEGS

The last 8-bit endeavor from Atari

By the mid-1980s, the home-computer and videogame-console markets were undergoing significant change. The dramatic crash of the videogame market in the United States beginning in 1983 underscored how saturated the market had become and how difficult it was for incumbents to revive growth. Meanwhile, the 8-bit home-computer market (machines like the Atari 800XL, the Commodore 64, and others) was seeing both its heyday and the first signs of transition to next-generation systems. Into this environment stepped Atari (via its incarnation as Atari Corporation after the company restructure under Jack Tramiel) with the XEGS in 1987. The goal was to blend the worlds of videogame console and home computer, offering compatibility with the existing Atari 8-bit computer line while presenting a more console-oriented form for families and gamers.

The Atari XEGS (Atari XE Video Game System) was essentially a redesign of the Atari 65XE home computer (part of the XE family), packaged as a games console with an optional keyboard that converted it into a fully capable Atari 8-bit computer. It was released in 1987. The machine used a MOS 6502C CPU (1.79 MHz NTSC or 1.77 MHz PAL), 64 KB of onboard RAM, and the familiar Atari 8-bit architecture: the ANTIC and GTIA chips for graphics and POKEY for sound. For media it used cartridges, and it also accepted most of the older Atari 8-bit software and hardware — hence it could in theory serve as both console and computer. It came in two packaging variants: the “basic” set (console + joystick) and the “deluxe” set (console + joystick + detachable full keyboard + XG-1 light gun).

Sales and production figures

Exact global sales figures for the XEGS are somewhat elusive and inconsistent. One credible data point is that Atari sold approximately 100,000 XE Game Systems during the Christmas 1987 launch period, a figure described as “every unit produced during its launch window”. Other sources note that support and new game releases tapered off after 1988, and that the 8-bit line (including the XEGS) was discontinued by Atari by early 1992. One source suggests that the XEGS did not feature at all in Atari’s annual reports after 1990, implying limited ongoing production. Therefore, while the 100,000-unit figure gives a snapshot of the launch, total lifetime sales may have been somewhat higher but were still modest compared with the major console players of the time.

What the press and magazines wrote, positives and negatives

Contemporary and retrospective commentary on the XEGS emphasised a mix of promise and limitations. On the positive side, reviewers appreciated that the XEGS offered a dual-mode device: for users who wanted a console (plug in, joystick, game) it served that, while for those willing to attach the keyboard it became a full home computer with the rich Atari 8-bit software and peripherals. For example, one reviewer called the combination “a brilliant idea” for users who “didn’t have the foggiest idea what to do with a computer … [but] would have no compunction about buying a great video-game system”. In hardware terms, the leveraging of the existing 8-bit Atari architecture meant that compatibility (in many cases) with prior software/hardware was possible, and the detachable keyboard allowed a low-entry price for console buyers.

On the negative side, the press and analysts were critical of several aspects. Despite claims of compatibility, not all older Atari 8-bit cartridges and peripherals ran 100% smoothly on the XEGS — some games required translation or did not run as expected. As one commentator put it: “Some are full-disk games that take anywhere from 1-4 disks … the end result is that games like ‘Astro Chase’ … won’t run on an XEGS system.” The hardware itself was already somewhat dated at launch: the architecture was essentially early-to-mid-1980s technology being pushed in 1987. The software library and developer support were weak compared to major competitors — few new flagship games were developed specifically for the XEGS, and one source notes that after 1988 there were virtually no new releases. Marketing seemed underwhelming, and the console/computer duality may have generated confusion in the market: was this a console for games, or a computer for hobbyists? Neither message was pushed strongly enough to differentiate the machine. In console form the XEGS lacked the cachet and ecosystem of leading rivals such as the Nintendo Entertainment System (NES), and some hardware, such as the XG-1 light gun, was judged sub-par in accuracy. Thus, while the XEGS held conceptual appeal, execution and market timing limited its impact.

Atari XEGS – the reasons to get one in the 1980s

  • As a plug-in console with joystick, for families seeking an affordable gaming system with some pedigree (the Atari brand) and a library of immediately playable cartridges.
  • As a “starter computer” for households where the idea of a keyboard and a programming hobby appealed, but full computer systems (dedicated home computers) seemed more expensive or complex.
  • Because the XEGS could remain a console but later be expanded by adding the keyboard, it offered flexibility.
  • For users already owning Atari 8-bit software/hardware, the XEGS offered backward compatibility and a convenient way to reuse existing cartridges, peripherals and cassette/disk drives.
  • For educational purposes: the full computer mode (with keyboard) provided access to programming (Atari BASIC Revision C) and the wide range of educational software developed for the Atari 8-bit line.
  • In markets where new console options were expensive and older 8-bit machines were still viable, the XEGS might have offered good value.

The lifespan of the XEGS was relatively short. Although released in 1987, support and production of the Atari 8-bit line (including the XEGS) were officially discontinued by December 1991. After that date, the XEGS was no longer actively supported and became effectively obsolete in the face of 16-bit machines and emerging console generations. One retrospective source notes that the system “did worse than the Atari 7800 … and was yet another instance of Atari failing to save itself from Nintendo’s increasingly domineering presence.” In effect, the XEGS occupies a transitional niche: a late 8-bit Atari product that tried to straddle the console-and-computer boundary but ultimately did not secure a large market share or long lifespan.

Competition and market context

When the XEGS launched in 1987, the competitive environment was formidable. On the console side, Nintendo’s NES dominated the market in many territories. On the computer side, 16-bit home computers (e.g., the Atari ST line, the Commodore Amiga) were beginning their rise. The Atari XEGS had to compete not only with dedicated consoles offering strong marketing and fresh libraries, but also with home computers offering more advanced capabilities for programming, graphics and higher memory. In addition, the fact that the XEGS was architecturally tied to older 8-bit technology meant it lacked the “wow factor” of newer machines and thus struggled to differentiate. Thus, the XEGS’s competitive disadvantage stemmed from being both too late to the console race and too modest compared to emerging computers.

Legacy

Though the Atari XEGS was not a major success, it remains of interest to retro-computing and retro-gaming enthusiasts. It represents a “bridge” model — a home console built from a home computer architecture and intended for both gaming and computing. The fact that it is compatible with the wider Atari 8-bit ecosystem gives it a broad software base for hobbyists. For many collectors, the XEGS (especially in its deluxe keyboard + light-gun bundle) is a notable piece of Atari history and a symbol of the company’s efforts to reposition itself in the late 1980s. While the machine did not turn the tide for Atari, it is remembered as a bold if flawed attempt to straddle multiple market segments at once.

The Atari XE Game System (XEGS) was introduced in 1987 as an attempt by Atari Corporation to merge the worlds of console gaming and home computing, leveraging its existing 8-bit computer architecture (the XE line) in a new form factor. While it offered flexibility, compatibility and reasonable hardware for the era, it suffered from outdated technology, weak software support, and fierce competition from both dedicated consoles and emerging home computers. Although launch sales of around 100,000 units indicate some initial interest, the limited lifespan (discontinuation by ~1991) and modest impact on the market underscore its niche status. For users seeking an affordable console or introduction to computing in the late 1980s, the XEGS may have made sense—but in the evolving landscape of videogames and computers its capabilities were already somewhat behind the curve. Nonetheless, in retrospect it serves as an interesting footnote in Atari’s history and the creative cross-pollination of console and computer design.

Popular games 

Here are some games that were popular on the Atari XE.

Boulder Dash
Bug Hunt (light gun)
Dig Dug
Donkey Kong
Flight Simulator II
Frogger
Pac-Man
Pitfall!
River Raid
Zaxxon

 

Atari Jaguar

The story of the 64-bit Atari Jaguar
The dream that flopped

In the annals of video game history, few consoles embody the paradox of ambition and failure as clearly as the Atari Jaguar. Marketed boldly as the world’s first “64-bit” gaming console, the Jaguar aimed to re-establish Atari as a major force in the gaming industry during the early 1990s. But despite promising technology and nostalgic brand power, the Jaguar ultimately became one of gaming’s most infamous commercial failures. By the late 1980s, Atari Corporation, led by Jack Tramiel, was a diminished shadow of its former self. Once dominant in both home computers and gaming consoles, Atari had failed to match the success of Nintendo and Sega in the lucrative home console market. Systems like the Atari 7800 had been largely ignored by consumers, and the company’s attempts to innovate with handhelds like the Atari Lynx had also struggled to capture a mainstream audience. Desperate to reassert itself, Atari pinned its hopes on a next-generation console that would leapfrog its competitors technologically: the Atari Jaguar.

Development of the Jaguar began in the early 1990s, under the codename Project Jaguar. Atari partnered with Flare Technology, a group of engineers, to design the new hardware. The goal was clear: create a system powerful enough to surpass the Sega Genesis and Super Nintendo Entertainment System (SNES), and ideally challenge upcoming 32-bit consoles like the Sony PlayStation and Sega Saturn.

Atari’s engineers developed an innovative architecture:

  • Two custom 32-bit processors, named “Tom” and “Jerry.” Tom handled graphics and video output, while Jerry focused on audio and co-processing tasks.
  • A Motorola 68000 CPU, the same processor used in the Sega Genesis and Atari ST, primarily tasked with handling control logic and compatibility.
  • Atari marketed the system as a “64-bit” console, arguing that the combined capabilities of the two 32-bit processors justified the label. This claim was heavily disputed, but Atari insisted the architecture offered genuine 64-bit performance.

The console used ROM cartridges for media, eschewing the CD-ROM trend, although a Jaguar CD peripheral would be released later.

Release and Market Launch

The Atari Jaguar was officially launched in November 1993 in select markets in the United States. At $249.99, it was priced competitively against the SNES and Genesis. The initial rollout was limited, focusing on major urban centers before expanding nationally.

Atari heavily promoted the Jaguar’s “64-bit” architecture as its key differentiator. However, at launch, the Jaguar suffered from a critically small library of titles. The initial batch of games included:

  • Cybermorph (bundled with the console)
  • Trevor McFur in the Crescent Galaxy
  • Raiden (an arcade port)

While Cybermorph demonstrated 3D graphics beyond what SNES and Genesis could produce, critics found the gameplay repetitive and the visuals unimpressive for a “next-gen” system.

Press Reception: Hype Meets Skepticism

The press initially covered the Jaguar with cautious optimism. Atari’s bold claims attracted attention, and the prospect of a “64-bit” console intrigued consumers in an industry dominated by 16-bit systems.

However, reviews quickly turned critical:

  • The Jaguar’s unique architecture proved difficult to program. Developers often relied on the underpowered 68000 processor instead of exploiting the dual custom chips, leading to underwhelming performance.
  • Game libraries remained thin, and key titles were delayed.
  • The graphical leap was inconsistent: while 3D polygon graphics were possible, many games used 2D sprites, leading to comparisons with 16-bit systems rather than next-generation rivals.

Many reviewers began to view Atari’s “64-bit” claims as a marketing gimmick rather than a technological reality.

Despite an aggressive marketing campaign, the Jaguar struggled to gain traction:

  • In 1993, only around 17,000 units were sold.
  • By 1994, as availability expanded, sales increased, but not enough to challenge Nintendo or Sega.
  • In total, Atari sold approximately 150,000 to 250,000 Jaguar units globally during its lifespan.

In comparison, the SNES and Genesis each sold tens of millions of units.

In an attempt to address the limitations of cartridge media, Atari released the Jaguar CD peripheral in 1995, priced at $149.99. This add-on allowed the Jaguar to play CD-based games and offered multimedia features like CD audio playback. By 1996, Atari ceased production of the Jaguar and effectively exited the hardware business. Facing mounting financial losses, Atari Corporation merged with hard drive manufacturer JT Storage (JTS), marking the end of Atari as an independent gaming company. In 1998, Hasbro Interactive acquired the rights to the Atari brand. In 1999, Hasbro officially declared the Jaguar an “open platform,” allowing developers to create and distribute new software without licensing restrictions. This led to a small but dedicated homebrew community.

The Atari Jaguar remains one of gaming history’s most infamous failures — a case study in overpromising and underdelivering. Yet, its story is more nuanced:

  • Technologically, the Jaguar was ahead of its time in some respects, offering 3D graphics capability before the PlayStation and Saturn.
  • Its complex architecture hindered software development, a fatal flaw that limited its library and stifled third-party support.
  • Marketing missteps, poor game availability, and fierce competition doomed the console despite its potential.

Nevertheless, the Jaguar has earned a cult following among retro gaming enthusiasts. Titles like Tempest 2000, Alien vs. Predator, and Iron Soldier are fondly remembered as standouts on the platform.

The Atari Jaguar was both the final home console released by Atari and its final major attempt to reclaim its place in the gaming industry. Though it failed commercially, the Jaguar remains a testament to the company’s enduring spirit of innovation — even if that innovation was ultimately flawed. Today, the Jaguar symbolizes the end of an era. Atari, once a pioneer, exited the console market after the Jaguar’s failure, leaving the industry to companies like Sony, Nintendo, and Sega. But for a brief moment in the early 1990s, the Jaguar let Atari dream of leading the industry once more.

 

Atari 130XE

The flagship of Atari’s 8-bit line

In the world of home computing during the 1980s, few companies had as turbulent and transformative a history as Atari. Founded in 1972 and initially known for pioneering video games like Pong, Atari became synonymous with early gaming culture. However, following corporate upheaval in the early 1980s, the company underwent a radical transformation. This change was largely driven by one man: Jack Tramiel, the legendary founder of Commodore. His acquisition of Atari’s consumer division in 1984 marked a new chapter in the company’s history, focused on affordable, powerful home computers. Central to this era was the release of the Atari 130XE, a computer that embodied Tramiel’s philosophy of providing “computers for the masses, not the classes.” Jack Tramiel was a Polish-born entrepreneur who built Commodore International into a giant of the personal computing industry during the late 1970s and early 1980s.

His aggressive pricing strategies and focus on mass-market appeal made the Commodore 64 the best-selling computer of all time. In 1984, after a bitter falling-out with Commodore’s board of directors, Tramiel left the company he had founded. Seeking a way back into the industry, he seized an opportunity when Warner Communications, the owner of Atari, decided to sell off Atari’s struggling consumer division. Warner had suffered significant losses after the infamous video game crash of 1983 and was eager to offload its floundering hardware division. In July 1984, Jack Tramiel acquired Atari’s consumer electronics and home computer business, renaming it Atari Corporation. His goal was clear: to make Atari once again a leading and innovative manufacturer of affordable computers that could challenge his old company, Commodore. How does it feel to be in a situation where you are trying to beat the company you founded? One can only imagine how Tramiel felt. It was a small world, because in the 1970s, Apple founder Steve Jobs had started his career at Atari, and soon Atari was competing against Apple with its Atari ST computers.

Under Tramiel’s leadership, Atari quickly restructured its product line. In 1985, Atari introduced the 130XE, part of its new XE (XL Extended) line of 8-bit computers, alongside the 65XE and the gaming-focused XEGS. The Atari 130XE was positioned as the company’s top-tier 8-bit home computer. Though technologically based on the earlier Atari 800XL, the 130XE boasted key enhancements:

  • 128 KB of RAM, a significant upgrade compared to the 64 KB of its predecessor.
  • Compatibility with existing Atari 8-bit software and peripherals.
  • The same advanced graphics and sound capabilities that had made Atari’s 8-bit line famous, including ANTIC and GTIA graphics chips, and the POKEY sound chip.
  • Support for bank-switching to access the full 128 KB of RAM.
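The bank-switching mentioned above can be sketched in a few lines. The following is a minimal Python model (not Atari code) of how the 130XE’s PORTB register at $D301 (decimal 54017) selects which 16 KB extended bank appears in the CPU’s $4000–$7FFF window; the exact bit assignments are taken from commonly published 130XE hardware documentation and should be treated as an assumption here.

```python
# Minimal model of Atari 130XE bank switching via the PIA's PORTB
# register at $D301. Bit layout (assumed from common 130XE docs):
#   bit 4 (0x10): 0 = an extended-RAM bank is mapped into $4000-$7FFF,
#                 1 = normal main RAM is visible there
#   bits 2-3    : select which of the four 16 KB extended banks is mapped
def cpu_window_bank(portb: int):
    """Return the extended bank (0-3) the CPU sees at $4000-$7FFF,
    or None when bit 4 is set and main RAM occupies the window."""
    if portb & 0x10:               # extended RAM disabled for the CPU
        return None
    return (portb >> 2) & 0x03     # bank-select bits

# At power-up PORTB reads $FF, so no extended bank is mapped:
print(cpu_window_bank(0xFF))
# Clearing bit 4 maps a bank; bits 2-3 choose which one:
print(cpu_window_bank(0xEB))
```

On real hardware a BASIC program would achieve the same effect with a POKE to location 54017; the helper name `cpu_window_bank` is purely illustrative.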

In design, the 130XE adopted a sleeker, modernized case with a grey-and-black color scheme, intended to signal a break from the past and align with the aesthetics of Atari’s newly launched ST computers.

Atari’s 8-bit computers had a reputation for excellent graphics and sound, rivalling or surpassing contemporaries like the Commodore 64 in certain areas. Popular games like Star Raiders, Ballblazer, and Rescue on Fractalus! showcased the system’s capabilities. Schools and parents continued to value 8-bit computers for teaching programming and basic computing skills. The 130XE’s compatibility with Atari BASIC and its expandability made it ideal for learners. The expanded memory allowed users to run more advanced software, including word processors and spreadsheets, many of which previously struggled on 64 KB systems. The 130XE’s increased RAM opened up possibilities for more complex programs, homebrew software development, and experimental applications. Additionally, aggressive pricing and widespread compatibility with existing software and peripherals made the 130XE an attractive upgrade option for owners of older Atari 8-bit computers.

Market Performance and Sales

Exact sales figures for the Atari 130XE are difficult to determine, but it is estimated that Atari Corporation sold hundreds of thousands of units worldwide between 1985 and the late 1980s. While not achieving the mass success of the Commodore 64, the 130XE contributed significantly to Atari’s resurgence as a computer company under Tramiel’s leadership. In the United States, sales were modest but steady. In Europe, particularly in West Germany, France, and the UK, the 130XE found a more enthusiastic market, helped by competitive pricing and Atari’s strong brand recognition.

Jack Tramiel’s acquisition of Atari generated significant media attention. Computer magazines such as BYTE, Compute!, and Popular Computing Weekly reported extensively on the new direction Atari was taking. Tramiel was often portrayed as a hard-driving businessman, known for his ruthless cost-cutting and competitive instincts. His leadership style was sometimes controversial, but few doubted his ability to revive a struggling company.

The industry press generally viewed the 130XE positively, highlighting:

  • The generous 128 KB of RAM.
  • Strong backward compatibility with earlier Atari 8-bit software.
  • Solid graphics and sound performance.
  • Competitive pricing under Tramiel’s aggressive business model.

However, critics noted that the 130XE, like other 8-bit computers of the time, was beginning to feel outdated in comparison to newer 16-bit machines, including Atari’s own 520ST, which was also released in 1985.

While the Atari 130XE represented the pinnacle of Atari’s 8-bit line, the broader industry was evolving rapidly. The introduction of affordable 16-bit computers — including the Atari ST series and the Commodore Amiga — began to capture consumer attention, pushing 8-bit systems into the background. By the late 1980s, Atari Corporation gradually phased out the 8-bit XE series to concentrate on its 16-bit ST line, which became Tramiel’s primary focus. The 130XE and its siblings remained on store shelves into the early 1990s, especially in Europe, before production eventually ceased.

The Legacy

The Atari 130XE occupies an interesting place in computing history. While it did not revolutionize the market, it extended the life of the Atari 8-bit platform and demonstrated Jack Tramiel’s pragmatic approach to computing: offering capable, affordable machines to the mass market. Today, retro computing enthusiasts value the 130XE for its expanded memory, compatibility, and understated design. It represents both the peak and the conclusion of Atari’s 8-bit home computer era — a testament to a time when Atari tried to redefine itself under new leadership. In the end, the Atari 130XE was a solid, capable machine that marked the end of one era for Atari and the beginning of another. Under Tramiel’s leadership, the company had shifted focus from gaming consoles to serious computing. Though the rise of 16-bit systems would eventually eclipse the XE line, the 130XE remains an important chapter in Atari’s storied history, standing as a symbol of resilience and reinvention.

 

 

 

Sinclair ZX-81

The £99 Revolution Comes Home:
Sinclair ZX80 & ZX81

In the late 1970s, as the personal computing revolution began to take shape, a British electronics firm known for calculators and audio equipment decided to enter a new frontier. Sinclair Radionics, under the leadership of Sir Clive Sinclair, transformed itself from a consumer electronics manufacturer into one of the most influential computer companies in the UK. This transformation led to the development of the ZX80 and ZX81 — two minimalist home computers that democratized computing for a generation. Sinclair Radionics, founded by Clive Sinclair in 1961, had built its reputation on affordable consumer electronics, including radios and pocket calculators. By the late 1970s, however, declining profits and intense competition in the calculator market forced Sinclair to seek new opportunities. Recognizing the potential of microcomputers, Clive Sinclair envisioned creating low-cost computers affordable enough for ordinary households. To realize this goal, he founded a new venture in 1979: Sinclair Research Ltd. This new company would focus exclusively on developing affordable personal computers, starting with what would become the ZX80.

Released in February 1980, the ZX80 was Britain’s first mass-market home computer and one of the first truly affordable personal computers in the world. It was named after the Z80 microprocessor (although technically it used a Z80-compatible NEC chip), which operated at 3.25 MHz. The ZX80 featured a mere 1 KB of RAM, though expansions up to 16 KB were available. It was revolutionary in its pricing: just £99.95 in kit form or £119.95 assembled. For the first time, a computer was cheaper than many contemporary calculators or televisions, opening up ownership to hobbyists and families. The ZX80 had a distinctive white plastic case with a membrane keyboard — more like a touchpad than traditional keys — and connected to a household television for display. It output monochrome text and rudimentary block graphics using a resolution of 32 columns by 24 rows. Programs were stored on standard cassette tapes. Sinclair’s engineers achieved such affordability by minimizing components and omitting features standard in more expensive computers. Notably, the ZX80’s 4K BASIC supported only integer arithmetic, and the machine relied entirely on the CPU, rather than dedicated hardware, to generate its video display. As a result, the screen blanked whenever the CPU was processing input or running a program — a quirk users quickly became familiar with. While the ZX80’s power was limited, it attracted significant attention in both the UK and US. Magazines like Practical Computing and Your Computer praised it as a technological marvel, offering home computing at a previously unthinkable price point. Over 50,000 units of the ZX80 were sold before its successor was released — a significant figure for an early home computer, especially considering its lack of polish compared to more expensive rivals like the Apple II or Commodore PET.

The ZX81: Refining the Vision

Building on the success of the ZX80, Sinclair Research launched the ZX81 in March 1981. Retailing at £49.95 in kit form and £69.95 assembled, the ZX81 was even cheaper than its predecessor — an extraordinary achievement considering its improved functionality.

The ZX81 maintained the same core Z80 processor and minimalist design but introduced several key enhancements:

  • A redesigned black plastic case.
  • Improved power efficiency, reducing chip count with a custom ULA (Uncommitted Logic Array).
  • Introduction of SLOW and FAST display modes: in SLOW mode, the screen stayed visible during processing, albeit at reduced speed.
  • Floating-point math in BASIC, making calculations more practical.

With just 1 KB of onboard RAM, the ZX81 still relied on RAM packs for expansion, typically up to 16 KB. Programs were loaded from cassette tapes, and output remained black-and-white on a TV display.

Sales Success and Global Impact

The ZX81 was a phenomenal commercial success, selling around 1.5 million units worldwide. Sinclair marketed it effectively in both the UK and the United States, partnering with Timex to rebrand and distribute the ZX81 as the Timex Sinclair 1000 in North America. In Britain, the ZX81 became many users’ first computer. Schools, hobbyists, and home users embraced the machine for programming in BASIC, gaming, and educational applications. Throughout the early 1980s, Sinclair Research and Clive Sinclair himself gained significant media attention. Newspapers and magazines hailed Sinclair as a British innovator, likening him to Steve Jobs or Bill Gates. Whatever the merits of such comparisons, the press agreed that Sinclair had transformed personal computing from a niche hobby into a mainstream phenomenon.

Competitors

In the UK, the main competitors were:

  • Commodore VIC-20, offering color graphics and better sound.
  • BBC Micro, more powerful but significantly more expensive.
  • Atari 400/800 series, providing advanced graphics but at a premium price.

Despite these rivals, Sinclair’s aggressive pricing strategy ensured dominance in the low-cost segment.

The Legacy

The ZX80 and ZX81 established Sinclair Research as Britain’s premier computer brand in the early 1980s. By breaking the price barrier, Sinclair introduced thousands of people to computing, many of whom went on to careers in the industry. These early machines paved the way for Sinclair’s most famous computer: the ZX Spectrum, released in 1982, which added color graphics and sound, solidifying Sinclair’s place in computing history. The ZX80 and ZX81 were not powerful or feature-rich computers, but they were revolutionary nonetheless. By focusing on cost and simplicity, Sinclair Research transformed the personal computer from an expensive luxury into a household device. Clive Sinclair’s vision and these two humble machines played a key role in bringing computing to the masses — making them icons in the history of technology.

Eventually, however, Sinclair’s fortunes faded. Intense competition from Commodore, Amstrad, and overseas manufacturers, combined with poor investments in other products (like the Sinclair C5 electric vehicle), led to financial trouble. In 1986, Sinclair’s computer business was sold to Amstrad.

Oric-1

Britain’s Forgotten Home Computer Challenger

The Oric-1 was a British home computer launched in 1983 by Tangerine Computer Systems, aiming to capitalize on the rapidly growing 8-bit computer market dominated by the Sinclair ZX Spectrum. Designed as an affordable yet slightly more advanced alternative to Sinclair’s offerings, the Oric-1 blended accessible hardware with features meant to attract hobbyists, gamers, and educational users.

The Oric-1 was based on the MOS Technology 6502A processor, running at 1 MHz, paired with either 16 KB or 48 KB of RAM. Its display capabilities were modest but respectable for its price range: a text mode of 40 columns by 28 rows and basic graphics up to 240×200 pixels, with support for 8 colors. Sound output was handled by the AY-3-8912 chip, offering 3-channel audio—superior to the beeper sound of the ZX Spectrum.
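The AY-3-8912’s three tone generators are programmed with 12-bit period values rather than frequencies. As a rough sketch of the arithmetic involved (assuming the commonly cited 1 MHz AY clock on the Oric; the function names here are illustrative, not from any real Oric API):

```python
# Sketch: converting a target pitch to AY-3-8912 tone period registers.
# Assumption: a 1 MHz AY clock, as commonly cited for the Oric-1.
# Per the AY-3-8910/8912 family datasheet, the output frequency is
# f = clock / (16 * period), with the period split across a fine byte
# and a 4-bit coarse byte (registers R0/R1 for channel A).

AY_CLOCK_HZ = 1_000_000

def tone_period(freq_hz, clock_hz=AY_CLOCK_HZ):
    """12-bit tone period for a target frequency: f = clock / (16 * period)."""
    period = round(clock_hz / (16 * freq_hz))
    return max(1, min(period, 0xFFF))      # 12-bit register, never zero

def channel_a_registers(freq_hz):
    """Split the period into the fine (R0) and coarse (R1) register bytes."""
    period = tone_period(freq_hz)
    return period & 0xFF, (period >> 8) & 0x0F

# Concert A (440 Hz): round(1_000_000 / 7040) = 142
print(tone_period(440))           # 142
print(channel_a_registers(440))   # (142, 0)
```

Because the same f = clock / (16 × period) formula applies to all three channels, high notes are quantized more coarsely than low ones, a limitation every AY-based machine shared.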

The machine featured a full QWERTY keyboard, though early users criticized its spongy, rubber-like keys. Storage was cassette-based, standard for the time, and the machine offered compatibility with standard televisions for display output.

Unveiled in early 1983, the Oric-1 quickly caught public attention thanks to its competitive pricing and the promise of being a step up from Sinclair’s popular Spectrum. The press response was mixed: while magazines like Your Computer praised its potential and colorful output, early models suffered from reliability issues and bugs in the ROM, which led to criticism in publications such as Popular Computing Weekly.

Despite these teething problems, Oric International (formed after Tangerine was restructured) reported strong initial sales. Approximately 160,000 Oric-1 units were sold in the UK alone during its first year, and total worldwide sales reached around 350,000 units. The Oric-1 targeted the home education and gaming markets. Users enjoyed a growing library of simple games, educational software, and programming tools in BASIC. Its 3-channel sound capabilities appealed to budding musicians and hobbyist programmers who wanted to explore digital audio beyond the Spectrum’s limitations.

Competition and Decline

The Oric-1’s main rivals were the Sinclair ZX Spectrum and the Commodore 64. The Spectrum’s lower price and extensive software library gave it a substantial advantage in the UK market. Meanwhile, the Commodore 64 dominated internationally thanks to its superior graphics and sound capabilities. Oric attempted to follow up with the Oric Atmos in 1984, which addressed many hardware flaws and improved the keyboard. However, increased competition and the failure to break into international markets limited the company’s success. Oric International eventually went bankrupt in 1987. Though ultimately overshadowed, the Oric-1 is remembered as an ambitious attempt to diversify the British home computer scene. Its combination of 6502-based processing, decent graphics, and advanced sound made it a solid entry-level machine that remains fondly remembered by retro computing enthusiasts today.

 

Commodore Amiga 1000

Commodore Amiga 1000:
The Birth of a Multimedia Legend

The Commodore Amiga A1000, released in 1985, marked a revolutionary leap in personal computing. As the first model in the Amiga family, the A1000 introduced groundbreaking graphics, sound, and multitasking capabilities that were years ahead of its time. Often celebrated as the world’s first true multimedia computer, the Amiga A1000’s story is one of both innovation and missed opportunities. The Amiga A1000 was the result of work by Amiga Corporation, a small technology startup founded in 1982 by Jay Miner and several colleagues, many of whom had previously worked at Atari. Originally, the company sought to develop a next-generation game console called the Lorraine, but financial struggles led Amiga Corporation to shift focus towards creating a full-fledged personal computer. In 1984, Commodore International, recognizing the potential of Amiga’s technology, acquired the company. Commodore provided the necessary financial backing to complete development and bring the Amiga computer to market.

Technical Foundation

At the heart of the Amiga A1000 was the Motorola 68000 processor, running at 7.16 MHz (NTSC) or 7.09 MHz (PAL). It featured 256 KB of RAM as standard, expandable to 512 KB or more using internal and external expansions. What set the A1000 apart, however, was its custom chipset, known collectively as the Original Chip Set (OCS), consisting of Agnus, Denise, and Paula chips.

  • Agnus handled direct memory access (DMA) and controlled the blitter and copper co-processors for fast graphics operations.
  • Denise managed video output, enabling resolutions up to 640×512 (interlaced) and displaying up to 4096 colors in HAM (Hold-And-Modify) mode.
  • Paula controlled audio, delivering 4-channel stereo sound at up to 28 kHz, far superior to PC beeps and even rivaling some dedicated music synthesizers.
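Denise’s 4096 colors in HAM mode come from 4-bit red, green, and blue components (16 × 16 × 16). Each 6-bit HAM pixel either selects one of 16 base palette entries or holds the previous pixel’s color while modifying a single channel. A minimal decoding sketch, with illustrative names not drawn from any real Amiga API:

```python
# Sketch of Amiga HAM6 (Hold-And-Modify) pixel decoding on OCS.
# Each pixel is 6 bits: the top 2 bits select the operation, the low
# 4 bits carry either a palette index or a new channel value.
#   00 -> load (r, g, b) from one of 16 base palette entries
#   01 -> hold r, g; replace blue     10 -> hold g, b; replace red
#   11 -> hold r, b; replace green

def decode_ham6(pixels, palette):
    """Decode a scanline of 6-bit HAM pixels into (r, g, b) nibble tuples."""
    r, g, b = 0, 0, 0                  # start of line: border color
    out = []
    for p in pixels:
        ctrl, data = (p >> 4) & 0b11, p & 0b1111
        if ctrl == 0b00:               # select a base palette entry
            r, g, b = palette[data]
        elif ctrl == 0b01:             # modify blue only
            b = data
        elif ctrl == 0b10:             # modify red only
            r = data
        else:                          # modify green only
            g = data
        out.append((r, g, b))
    return out

palette = [(i, i, i) for i in range(16)]   # simple grey ramp as a stand-in
line = [0b000101, 0b010000, 0b101111]      # palette 5, then blue=0, then red=15
print(decode_ham6(line, palette))          # [(5, 5, 5), (5, 5, 0), (15, 5, 0)]
```

Since each pixel can change only one channel, an arbitrary color transition can take up to three pixels, which produced HAM’s characteristic color fringing on sharp edges.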

The A1000 also introduced the Amiga Operating System, combining a graphical user interface (Workbench) with a multitasking kernel (Exec), offering pre-emptive multitasking at a time when most personal computers could only run one task at a time.

Official Launch in 1985

The Amiga A1000 was officially unveiled on July 23, 1985, at a high-profile launch event held at the Lincoln Center in New York City. This was no ordinary product reveal — Commodore aimed to position the Amiga as not just a computer, but as a symbol of creativity and technological progress.

Two cultural icons helped highlight the Amiga’s creative capabilities:

  • Debbie Harry, lead singer of Blondie, was invited on stage as the model for a demonstration of the A1000’s graphical potential.
  • Andy Warhol, already known for embracing new technologies in art, created a digital portrait of Harry live on stage using an Amiga A1000 and graphics software called ProPaint, showcasing how the machine could serve as a tool for modern artists.

This event was not just about hardware specifications; it was about positioning the Amiga as the future of multimedia computing.

At launch, the Amiga A1000 received highly positive reviews from the computing press. Critics were astonished by its multimedia prowess and its pre-emptive multitasking capabilities. Magazines such as Byte, InfoWorld, and Compute! highlighted the A1000’s advanced architecture, with Byte famously stating that the Amiga “represents the first true multimedia computer”.

However, the press also noted the A1000’s relatively high price — around $1,295 USD for the base unit (without monitor or additional memory). This pricing placed it above many home computers but below professional workstations like the Apple Macintosh or IBM PC/AT.

One innovative but unusual design feature was the Kickstart system. Unlike later Amiga models, which stored Kickstart in ROM, the A1000 loaded this part of its operating system from floppy disk into a special 256 KB writable memory area each time it booted. This made OS upgrades as easy as swapping a disk, but it slightly lengthened the startup process and made the machine dependent on the Kickstart disk.

Sales and Commercial Performance

Despite its technological strengths, the Amiga A1000 faced commercial challenges. Exact sales figures remain unclear, but estimates suggest that approximately 150,000 to 200,000 units were sold worldwide during its production run from 1985 to 1987. This made it more of a niche product compared to Commodore’s earlier success with the Commodore 64.

Several factors limited sales:

  • Commodore struggled to clearly define and market the A1000’s target audience, caught between the home computer and professional workstation markets.
  • The high initial price deterred average consumers.
  • Commodore’s marketing resources were split between the Amiga and its declining 8-bit product lines.

Competitors

The Amiga A1000 faced competition from multiple fronts:

  • Apple Macintosh: With its strong foothold in desktop publishing, the Mac was a key competitor in the creative professional market, though it lacked the multimedia power of the A1000.
  • IBM PC and Compatibles: Dominating the business sector, PCs offered familiarity and expandability, though they lagged significantly in graphics and audio capabilities.
  • Atari ST: Launched in 1985 shortly before the A1000, the Atari 520ST and 1040ST offered similar 16/32-bit power at a lower price, with built-in MIDI ports making the ST popular among musicians.

Ironically, the Amiga’s core designers were themselves former Atari engineers, so with the ST, Atari ended up competing directly against a machine created by its own former talent. The Amiga A1000’s production ended in 1987 as Commodore shifted focus to the more affordable Amiga 500 and professional-grade Amiga 2000. These models reached wider audiences, with the A500 becoming especially popular among home users and gamers. Although the A1000 itself was discontinued, its technological innovations laid the foundation for the entire Amiga platform, which would thrive through the late 1980s and early 1990s. Today, the A1000 is considered a collector’s item and a symbol of what might have been had Commodore more effectively marketed and developed the platform.

The Legacy

The Commodore Amiga A1000 stands as a milestone in personal computing history. Its pioneering multimedia architecture paved the way for modern digital content creation and interactive entertainment. While the A1000 itself was not a commercial blockbuster, it seeded a passionate community that sustained the Amiga platform long after Commodore’s eventual bankruptcy in 1994. For many enthusiasts and historians, the A1000 represents the moment when computers stopped being just office machines and began becoming creative tools — machines not only for work, but for art.

Atari 1040 ST

The Atari 1040ST, launched in 1986, was a popular 16-bit home computer known for its affordability and advanced graphics and sound. Powered by a Motorola 68000 CPU at 8 MHz and featuring 1 MB of RAM, it was widely used for music production thanks to built-in MIDI ports. Competing with the Commodore Amiga, the 1040ST earned praise for its fast graphical interface and productivity software, becoming a favourite among musicians and hobbyists throughout the late 1980s.

In 1986, Atari started promoting the Atari 1040ST, a home computer that would come to define a generation of 16-bit computing in Europe. Positioned as a powerful yet affordable alternative to the Commodore Amiga and IBM-compatible PCs, the 1040ST offered a combination of processing power, graphics capabilities, and multimedia potential that made it stand out in a crowded market. Its introduction marked a significant step forward from Atari’s earlier 8-bit computers, demonstrating how personal computing could be both accessible and technically sophisticated. At the heart of the 1040ST was the Motorola 68000 CPU, running at 8 MHz, paired with 1 MB of RAM—an impressive configuration for a home computer at the time. The system supported high-resolution monochrome and color graphics, with resolutions up to 640×400 in monochrome and 320×200 in color, making it suitable for both gaming and professional applications. Unlike many of its contemporaries, the 1040ST featured a built-in MIDI interface, which quickly made it a favorite among musicians and studios, demonstrating Atari’s foresight in recognizing the convergence of computing and creative work. The combination of power, expandability, and affordability positioned the 1040ST as a versatile machine for hobbyists, educators, and professionals alike.

One of the most compelling aspects of the Atari 1040ST was its balance between technical sophistication and user accessibility. The machine ran TOS (The Operating System) with the GEM graphical interface, providing a relatively intuitive environment for users transitioning from 8-bit computers or early DOS machines. For gamers, the 1040ST supported a growing library of titles that leveraged its graphics and sound capabilities. From arcade-style action games to complex strategy titles, developers were able to push the hardware to create engaging experiences that rivaled more expensive systems. Finnish users, along with other European enthusiasts, appreciated the machine’s ability to handle both work and play, making it a true all-in-one solution for home computing. The 1040ST’s influence extended far beyond gaming. Its MIDI capabilities and relatively low cost made it a standard in small music studios, educational institutions, and multimedia labs. Musicians could connect keyboards, synthesizers, and sequencers directly to the computer, using software for composition and performance—a remarkable capability in the mid-1980s. In Finland and across Europe, this feature introduced many users to digital music production, fostering creativity and technical skills simultaneously. The combination of powerful hardware, expandability, and built-in interfaces helped establish the 1040ST as a bridge between consumer computing and professional creative work. Despite its many strengths, the Atari 1040ST faced challenges. While it was more affordable than the Commodore Amiga, it lacked some of the advanced graphics and sound capabilities of its competitor, particularly in multimedia-rich applications and games. However, its performance, especially in productivity and music applications, often outweighed these limitations. 
Its influence on the European computer market cannot be overstated; it helped establish a standard for what a mid-range home computer could accomplish and inspired a generation of users to explore both programming and creative software.

In retrospect, the Atari 1040ST represents a critical juncture in the evolution of personal computing. It combined 16-bit processing power, professional-grade multimedia support, and user accessibility in a package that was both affordable and versatile. Finnish computer enthusiasts, musicians, and hobbyists embraced it as a machine that could handle a wide range of tasks, from gaming and programming to music composition and graphics work. The 1040ST’s legacy is reflected not only in its hardware achievements but also in its influence on a generation of creative and technical users who were introduced to computing through its accessible yet sophisticated platform. Ultimately, the Atari 1040ST stands as a testament to a period in computing history when innovation, versatility, and user engagement converged. It was a machine that could entertain, educate, and inspire, bridging the gap between hobbyist experimentation and professional creativity. Its enduring appeal lies in its ability to combine power with accessibility, demonstrating that a home computer could be both a tool for work and a source of imagination—a legacy that continues to resonate with enthusiasts and retro computing fans around the world.

Atari 400

Atari Enters the Home Computer Market:
Introducing the Atari 400

In 1979, Atari introduced the Atari 400, a home computer that would play a pivotal role in the early days of personal computing. Released alongside its sibling, the Atari 800, the 400 was designed as an approachable, family-friendly machine capable of gaming, education, and light productivity. While modest in specifications compared to later 16-bit systems, the Atari 400 represented a significant leap forward from early microcomputers, bringing color graphics, sound, and interactive software into the homes of a generation of users, including enthusiasts in the USA and across Europe. The Atari 400 was powered by the MOS Technology 6502 CPU, running at 1.79 MHz, and offered 8 KB to 16 KB of RAM, expandable with cartridges. Graphics and sound were handled by custom co-processors: the CTIA/GTIA graphics chip provided sprite-based visuals, while the POKEY chip handled both sound generation and input devices. These dedicated chips allowed the 400 to deliver rich audiovisual experiences that set it apart from competitors, particularly in the realm of home gaming. The machine’s membrane keyboard, designed for durability and simplicity, made it approachable for children and novice users, though it was less comfortable for extended typing or programming sessions.

Gaming was a primary use for the Atari 400, and the system’s hardware capabilities enabled a wide range of experiences. Arcade-style titles, educational software, and text-based adventures all thrived on the 400’s platform. Developers leveraged the 400’s sprite graphics and sound capabilities to create engaging, visually appealing games that captivated users despite the machine’s limited memory. Finnish hobbyists and computer clubs embraced the Atari 400 for its ability to run both entertaining and educational programs, establishing it as a versatile machine for home use and early learning. One of the Atari 400’s strengths lay in its expandability and support for cartridges, which simplified software installation and expanded the machine’s capabilities. Users could insert cartridges for games, educational titles, or programming languages such as Atari BASIC, allowing immediate access without cumbersome tape or disk loading. The built-in BASIC interpreter encouraged experimentation and learning, enabling young users to create their own programs, explore computational logic, and develop problem-solving skills. This accessibility was key to the machine’s enduring appeal in both educational and hobbyist settings.

Despite its strengths, the Atari 400 had notable limitations. The membrane keyboard, while durable and child-friendly, was often criticized for its lack of tactile feedback, making extended typing or programming less comfortable. Its memory limitations constrained the complexity of software compared to machines like the Commodore 64, and early disk storage options were expensive and limited. Nevertheless, the 400’s affordability, simplicity, and rich audiovisual capabilities made it a strong entry-level home computer, especially for families and schools seeking an introduction to computing. Children, students, and hobbyists could explore programming, play games, and engage with technology in ways that were previously inaccessible. Its support for BASIC, along with a growing library of cartridges and educational titles, ensured that the 400 was not only entertaining but also a tool for skill development. Users could learn coding, experiment with graphics, and even begin designing games, fostering a generation of creative and technically literate individuals. Looking back, the Atari 400 represents a foundational moment in the evolution of personal computing. It combined approachability, audiovisual sophistication, and educational potential in a compact, affordable package, laying the groundwork for Atari’s subsequent 8-bit successes and influencing the home computer market broadly. Its impact extended beyond mere entertainment; it introduced users to programming, digital logic, and interactive software, shaping how a generation approached technology. The Atari 400 remains a symbol of early home computing innovation, a testament to the era when personal computers first entered living rooms and classrooms, inspiring creativity, learning, and imagination.

 
