The Internet - The first Worldwide Tool of Unification ("The End of History")

" ... Now I give you something that few think about: What do you think the Internet is all about, historically? Citizens of all the countries on Earth can talk to one another without electronic borders. The young people of those nations can all see each other, talk to each other, and express opinions. No matter what the country does to suppress it, they're doing it anyway. They are putting together a network of consciousness, of oneness, a multicultural consciousness. It's here to stay. It's part of the new energy. The young people know it and are leading the way.... "

" ... I gave you a prophecy more than 10 years ago. I told you there would come a day when everyone could talk to everyone and, therefore, there could be no conspiracy. For conspiracy depends on separation and secrecy - something hiding in the dark that only a few know about. Seen the news lately? What is happening? Could it be that there is a new paradigm happening that seems to go against history?... " Read More …. "The End of History"- Nov 20, 2010 (Kryon channelled by Lee Carroll)

"Recalibration of Free Choice"– Mar 3, 2012 (Kryon Channelling by Lee Carroll) - (Subjects: (Old) Souls, Midpoint on 21-12-2012, Shift of Human Consciousness, Black & White vs. Color, 1 - Spirituality (Religions) shifting, Loose a Pope “soon”, 2 - Humans will change react to drama, 3 - Civilizations/Population on Earth, 4 - Alternate energy sources (Geothermal, Tidal (Paddle wheels), Wind), 5 – Financials Institutes/concepts will change (Integrity – Ethical) , 6 - News/Media/TV to change, 7 – Big Pharmaceutical company will collapse “soon”, (Keep people sick), (Integrity – Ethical) 8 – Wars will be over on Earth, Global Unity, … etc.) - (Text version)

“…5 - Integrity That May Surprise…

Have you seen innovation and invention in the past decade that required thinking out of the box of an old reality? Indeed, you have. I can't tell you what's coming, because you haven't thought of it yet! But the potentials of it are looming large. Let me give you an example. Let us say that 20 years ago, you predicted that there would be something called the Internet on a device you don't really have yet, using technology that you can't imagine. You will have full libraries, buildings filled with books, in your hand - a worldwide encyclopedia of everything knowable, with the ability to look it up instantly! Not only that, but that look-up service isn't going to cost a penny! You can call friends and see them on a video screen, and it won't cost a penny! No matter how long you use this service and to what depth you use it, the service itself will be free.

Now, anyone listening to you back then would perhaps have said, "Even if we can believe the technological part, which we think is impossible, everything costs something. There has to be a charge for it! Otherwise, how would they stay in business?" The answer is this: With new invention come new paradigms of business. You don't know what you don't know, so don't decide in advance what you think is coming based on an old energy world. ..."
(Subjects: Who/What is Kryon ?, Egypt Uprising, Iran/Persia Uprising, Peace in Middle East without Israel actively involved, Muhammad, "Conceptual" Youth Revolution, "Conceptual" Managed Business, Internet, Social Media, News Media, Google, Bankers, Global Unity,..... etc.)


German anti-hate speech group counters Facebook trolls


Bundestag passes law to fine social media companies for not deleting hate speech

Honouring computing’s 1843 visionary, Lady Ada Lovelace. (Design of doodle by Kevin Laughlin)

Monday, May 20, 2013

Is computing speed set to make a quantum leap?

Quantum mechanics research could hold the key to a new generation of super-fast computers

The Guardian, The Observer, John Naughton, 18 May 2013

The Large Hadron Collider was built in the pursuit of pure science, but research into quantum mechanics might soon yield enormous benefits for computing. Photograph: Rex Features

"Our imagination is stretched to the utmost," wrote Richard Feynman, the greatest physicist of his day, "not, as in fiction, to imagine things which are not really there, but just to comprehend those things that are there." Which is another way of saying that physics is weird. And particle physics – or quantum mechanics, to give it its posh title – is weird to the power of n, where n is a very large integer.

Consider some of the things that particle physicists believe. They accept without batting an eyelid, for example, that one particular subatomic particle, the neutrino, can pass right through the Earth without stopping. They believe that a subatomic particle can be in two different states at the same time. And that two particles can be "entangled" in such a way that they can co-ordinate their properties regardless of the distance in space and time that separates them (an idea that even Einstein found "spooky"). And that whenever we look at subatomic particles they are altered by the act of inspection so that, in a sense, we can never see them as they are.

For a long time, the world looked upon quantum physicists with a kind of bemused affection. Sure, they might be wacky, but boy, were they smart! And western governments stumped up large quantities of dosh to enable them to build the experimental kit they needed for their investigations. A huge underground doughnut was excavated in the suburbs of Geneva, for example, and filled with unconscionable amounts of heavy machinery in the hope that it would enable the quark-hunters to find the Higgs boson, or at any rate its shadowy tracks.

All of this was in furtherance of the purest of pure science – curiosity-driven research. The idea that this stuff might have any practical application seemed, well, preposterous to most of us. But here and there, there were people who thought otherwise (among them, as it happens, Richard Feynman). In particular, these visionaries wondered about the potential of harnessing the strange properties of subatomic particles for computational purposes. After all, if a particle can be in two different states at the same time (in contrast to a humdrum digital bit, which can only be a one or a zero), then maybe we could use that for speeded-up computing. And so on.

Thus was born the idea of the "quantum computer". At its heart is the idea of a quantum bit or qubit. The bits that conventional computers use are implemented by transistors that can either be on (1) or off (0). Qubits, in contrast, can be both on and off at the same time, which implies that they could be used to carry out two or more calculations simultaneously. In principle, therefore, quantum computers should run much faster than conventional, silicon-based ones, at least in calculations where parallel processing is helpful.
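The qubit idea described above can be sketched numerically. Below is a minimal, hypothetical simulation (not how a D-Wave machine works, which uses quantum annealing): a qubit is modelled as a two-component complex state vector, a Hadamard gate puts it into an equal superposition of 0 and 1, and the exponential growth of the state space hints at why parallel speed-ups are hoped for.

```python
import numpy as np

# A qubit is a 2-component complex state vector; |0> = [1, 0], |1> = [0, 1].
zero = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero
probs = np.abs(superposed) ** 2  # Born rule: measurement probabilities
print(probs)  # [0.5 0.5] -- "both on and off at the same time"

# n qubits require 2**n amplitudes to describe: the state space grows
# exponentially, which is where the hoped-for parallelism comes from.
n = 10
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # all qubits start in |0>
print(len(state))  # 1024 amplitudes for just 10 qubits
```

A classical bit, by contrast, would need only one number per bit: ten bits, ten values, rather than 1,024 amplitudes.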

For as long as I have been paying attention to this stuff, the academic literature has been full of arguments about quantum computing. Some people thought that while it might be possible in theory, in practice it would prove impracticable. But while these disputes raged, a Canadian company called D-Wave – whose backers include Amazon boss Jeff Bezos and the "investment arm" of the CIA (I am not making this up) – was quietly getting on with building and marketing a quantum computer. In 2011, D-Wave sold its first machine – a 128-qubit computer – to military contractor Lockheed Martin. And last week it was announced that D-Wave had sold a more powerful machine to a consortium led by Google and Nasa and a number of leading US universities.

What's interesting about this is not so much its confirmation that the technology may indeed be a practical proposition, though that's significant in itself. More important is that it signals the possibility that we might be heading for a major step change in processing power. In one experiment, for example, it was found that the D-Wave machine was 3,600 times faster than a conventional computer in certain kinds of applications. Given that the increases in processing power enabled by Moore's law (which applies only to silicon and says that computing power doubles roughly every two years) are already causing us to revise our assumptions about what computers can and cannot do, we may have some more revisions to do. All of which goes to prove the truth of the adage: pure research is just research that hasn't yet been applied.
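To put the article's two figures side by side: Moore's law, as stated above, compounds at a doubling every two years, while the reported D-Wave result was a one-off 3,600x jump. A quick back-of-the-envelope sketch (using only the numbers quoted in the article):

```python
import math

def moores_law_factor(years, doubling_period=2):
    """Growth factor after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(10))  # 32.0 -- a decade of Moore's law buys 32x
print(moores_law_factor(20))  # 1024.0

# Years of Moore's-law scaling needed to match the reported 3,600x speedup:
years_needed = 2 * math.log2(3600)
print(round(years_needed, 1))  # about 23.6 years
```

In other words, if the benchmark held up generally (a big if, given the "certain kinds of applications" caveat), it would represent a jump that silicon scaling would take over two decades to deliver.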

