U.S. export controls are complicated when it comes to IT systems. Initially, most computing machinery was for business use; then, in the WWII time-frame, military applications became important. For a while, military and space program requirements drove innovation in IT performance and packaging, with most of the work in the U.S. done by private government contractors or on research grants rather than by civil servants. Still, competition to meet government requirements tended to drive technical performance, and the underlying fundamental research came as much from DARPA, NASA, NSF, and similar funding as from direct corporate investment in big research labs (e.g., Bell Labs, Xerox PARC, and the IBM and HP research labs). Export control regs were then written assuming that aerospace IT would be the cutting edge.
After the introduction of the Apple II (and other consumer/hobbyist systems like the Osborne), things began to change. Unexpected markets developed. Workstation makers (Sun, Apollo, HP, etc.) began mass roll-outs of systems with processors optimized for reduced instruction set computing (RISC). Soon workstations sold for non-classified applications were running rings around pricey minicomputers and commodity business mainframes. There was a renaissance in new technology coming out of corporate research labs (without government funding). The pace of change became such that by the time cumbersome military/aerospace procurement completed a purchase, the products delivered were already obsolete. (A NASA procurement program with rolling upgrades, SEWP, actually became the go-to place for federal researchers to buy state-of-the-art hardware while still following U.S. government procurement regs.) A foreign adversary might buy a non-military workstation (sold for financial services, for example) that could easily be converted to defense use AND run faster than the mil-spec hardware in use by defense establishments in the Western Alliance. That put export control for "dual use" IT in a genuine no-win situation: clamp down hard and the technology would just move elsewhere, putting U.S. producers and military buyers at a serious competitive disadvantage.
Finally, as the market for personal computers grew into sales of hundreds of millions of units, Moore's Law came to consumer commodity IT systems. As the feature sizes of processors and memory chips became smaller and smaller, the cost of the plants (fabs) needed to produce them grew almost as fast. Eventually, fabs became so expensive that specialized military and aerospace processors were priced out of the production lines with the smallest feature sizes. Only chips with consumer-sized potential markets could justify the frequent >$10 billion investments in upgraded fabrication lines. Server farms, built with (mostly) consumer hardware, came to outperform purpose-designed supercomputer hardware for all but a handful of specialized applications.
Many of the smaller chip factories closed, and small-market military and aerospace needs ended up being served by a kind of ghetto of older, slower, fault-tolerant, electromagnetic-pulse-resistant technology. At this point, there's still some specialized embedded control/guidance hardware that's worth subjecting to export control. Much of the effort in the last decade or two has focused instead on controlling the export of software with military applications. That said, the processor in your new smartphone, a product category selling on the order of a billion units, is now faster than anything used for defense purposes during the Cold War. Not to mention: most of the fastest and most energy-efficient processors and memory chips are produced in China or Taiwan.