
Fifth Generation of Computers

Earlier Generations of Computing

The first generation of computing is generally thought of as the “vacuum tube era.” These computers used large vacuum tubes as their circuits and magnetic drums as their memory. They generated a tremendous amount of heat and, as any computer professional can attest, this led to a large number of failures and crashes in the early years of computing. This first generation of computing lasted roughly sixteen years, from 1940 to 1956, and was characterized by massive computers that could fill an entire room. The most notable of these large yet quite basic machines were the ENIAC and UNIVAC.

Second-generation computing was characterized by a switch from vacuum tubes to transistors, and saw a significant decrease in the size of computing devices. Invented in 1947, the transistor came to computers in 1956. Its dominance in computing machines lasted until 1963, when integrated circuits began to supplant it. Transistors nonetheless remain a fundamental part of modern computing: even modern-day Intel chips contain billions of transistors – microscopic in size, and not nearly as power-hungry as their much earlier predecessors.

Between 1964 and 1971, computing began to take baby steps toward the modern era. During this third generation of computing, the integrated circuit increased the speed and efficiency of computers by leaps and bounds, while simultaneously shrinking them even further in size. Integrated circuits placed miniaturized transistors, much smaller than the discrete transistors found in earlier computers, onto a single silicon chip. This is still the basis for modern processors, though at a much, much smaller scale.

In 1971, computing hit the big time: the microprocessor. Microprocessors can be found in every computing device today, from desktops and laptops to tablets and smartphones. A microprocessor packs an entire processor, built from thousands (and today billions) of transistors, onto a single chip. Its parts are microscopic, allowing one small processor to handle many tasks at the same time with very little loss of processing speed or capacity.

Because of their extremely small size and large processing capacity, microprocessors enabled the home computing industry to flourish. IBM introduced its first personal computer, the IBM PC, in 1981; three years later, Apple followed with the wildly successful Macintosh, which helped revolutionize the industry and made the microprocessor industry a mainstay of the American economy.

Chip manufacturers like AMD and Intel sprouted up and flourished in Silicon Valley alongside established brands like IBM. Their mutual innovation and competitive spirit led to the most rapid advancement of processing speed and power in the history of computing, and enabled a marketplace that is today dominated by handheld devices vastly more powerful than the room-sized computers of just a half-century ago.

Fifth Generation of Computing

Technology never stops evolving and improving, however. While the microprocessor has revolutionized the computing industry, the fifth generation of computing looks to turn the whole industry on its head once again. The fifth generation of computing is called “artificial intelligence,” and the goal of computer scientists and developers is to eventually create computers that outsmart, outwit, and maybe even outlast their human inventors.

Fifth-generation computing has already beaten humans at a number of games – most notably in 1997, when IBM’s Deep Blue defeated Garry Kasparov, then the reigning world chess champion. But while it can beat humans at highly methodical gameplay, fifth-generation computing still lacks the ability to understand natural human speech and affectation. Artificial intelligence is not yet intelligent enough to interact with its human counterparts and – more importantly – truly understand them.
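To make “methodical gameplay” concrete, the sketch below is a deliberately tiny illustration – not Deep Blue’s actual algorithm – of minimax search on tic-tac-toe: the program mechanically explores every possible continuation and scores each position, which is exactly the kind of exhaustive, rule-driven reasoning computers excel at.

```python
# Minimal minimax search over tic-tac-toe (an illustration only; Deep Blue
# used far more sophisticated, chess-specific search and custom hardware).

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that player has three in a row, else None."""
    for a, b, c in WIN_LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Score a position from X's point of view: +1 win, 0 draw, -1 loss."""
    win = winner(board)
    if win is not None:
        return 1 if win == 'X' else -1
    empty = [i for i, cell in enumerate(board) if cell is None]
    if not empty:
        return 0                      # board full: draw
    scores = []
    for move in empty:
        board[move] = player          # try the move...
        scores.append(minimax(board, 'O' if player == 'X' else 'X'))
        board[move] = None            # ...then undo it
    return max(scores) if player == 'X' else min(scores)

# With perfect play from an empty board, tic-tac-toe is always a draw (0).
print(minimax([None] * 9, 'X'))
```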

But strides have been made. Many computers and smartphones on the market contain a rudimentary voice recognition feature that can translate human speech into text. However, they still require slow, very deliberate dictation – otherwise words become jumbled or erroneous. And they are still not receptive to the human affectation that might indicate the need for capital letters, question marks, or things such as bold and italicized type. A rough illustration of this kind of dictation feature appears below.
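The following sketch uses the third-party Python SpeechRecognition package – an assumption for illustration, not the engine any particular phone actually ships – to transcribe a short audio file through an online recognizer. Even then, slow and clearly articulated dictation gives far better results than natural conversation, and the output carries none of the punctuation or emphasis cues described above.

```python
# A minimal speech-to-text sketch using the third-party SpeechRecognition
# package (pip install SpeechRecognition). Illustration only; the Google
# recognizer used here requires an internet connection.
import speech_recognition as sr

recognizer = sr.Recognizer()

# "dictation.wav" is a hypothetical recording of slow, clearly spoken words.
with sr.AudioFile("dictation.wav") as source:
    audio = recognizer.record(source)

try:
    text = recognizer.recognize_google(audio)
    print("Transcript:", text)   # plain lowercase text, no punctuation cues
except sr.UnknownValueError:
    print("The speech could not be understood.")
except sr.RequestError as err:
    print("Recognition service unavailable:", err)
```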

As microprocessors continue to increase their power by leaps and bounds, these hallmarks of artificial intelligence will become easier to develop and implement. It is easy to underestimate the complexity of human language and patterns of communication, but the simple fact is that translating those things into raw computing power and ability requires a great deal of time and resources – in some cases, resources that have yet to be fully developed and put into a computer chip.

Ali Gheli

