IBM, Mind Control, and the Implantable Chip (666)
Nov. 3, 2013: The dawn of the computer-controlled human.
Intel Labs creating implantable Mark of the Beast Mind Control Processor Chip.
Scientists have been working on the fusion of human beings and computer-controlled machines, known as the singularity.
Intel Labs has refined this technology to be the heart and soul of the Mark of the Beast Surveillance System that will control the entire world. Obamacare calls for implanting microchips into humans.
RFID is calling the shots in everything from your car to washers, dryers, and coffee makers.
This is being done to create a system that can include you or lock you out.
Participation in this exclusive club will require you to willingly receive the Mark of the Beast.
Neutrality will not be allowed.
Intel calls this The Tomorrow Project.
IBM was working on Project 666 in 1971.
Everyone everywhere is involved.
By 2020 you won't need a keyboard and mouse to control your computer; you will surf the web using nothing more than your brain waves.
Japan has gone way too far in this.
Big Brother won't be planting chips in your brain against your will.
666, antichrist, beast, vatican
HARBINGER WARNINGS - Isaiah 9 prophecy
When GOD destroys the USA, you can't say He didn't WARN us!
Transhumanism, Forbidden Gates, AntiChrist Rising
Biometrics, Human Chipping, National ID card
Posted <*))))>< by
DAILY NEWS with prophetic analysis
Pretty much all of the pieces are set - the only thing left is for our Lord Jesus Christ to open the 1st seal (and all of the seals/trumpets/vials of wrath after that).
Rev 5:1 And I saw in the right hand of him that sat on the throne a book written within and on the backside, sealed with seven seals.
Rev 5:2 And I saw a strong angel proclaiming with a loud voice, Who is worthy to open the book, and to loose the seals thereof?
Rev 5:3 And no man in heaven, nor in earth, neither under the earth, was able to open the book, neither to look thereon.
Rev 5:4 And I wept much, because no man was found worthy to open and to read the book, neither to look thereon.
Rev 5:5 And one of the elders saith unto me, Weep not: behold, the Lion of the tribe of Juda, the Root of David, hath prevailed to open the book, and to loose the seven seals thereof.
'Biohacker' implants chip in arm
Kids, don't try this at home: A self-described "biohacker" had a big electronic chip almost as large as a deck of cards inserted beneath the skin of his arm. Without a doctor's help. And without anesthetics.
Tim Cannon is a software developer from Pittsburgh and one of the developers at Grindhouse Wetware, a firm dedicated to "augmenting humanity using safe, affordable, open source technology," according to the group's website. As they explain it, "Computers are hardware. Apps are software. Humans are wetware."
The device Cannon had inserted into his arm is a Circadia 1.0, a battery-powered implant that can record data from Cannon's body and transmit it to his Android mobile device. Because no board-certified surgeon would perform the operation, Cannon turned to a DIY team that included a piercing and tattoo specialist who used ice to quell the pain of the procedure. [Super-Intelligent Machines: 7 Robotic Futures]
Now that the device is inserted and functioning, Cannon is one step closer to achieving a childhood dream. "Ever since I was a kid, I've been telling people that I want to be a robot," Cannon told The Verge. "These days, that doesn't seem so impossible anymore."
The Circadia chip isn't particularly advanced: All it does is record Cannon's body temperature and transmit it to his cellphone over a Bluetooth connection. While this isn't a huge improvement over an ordinary thermometer (how analog!), it does represent one small step forward in what will undoubtedly be a continuing march toward greater integration of electronics and biology.
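For the curious, the data such a device sends is mundane: a few bytes per reading. The sketch below decodes a temperature value from a raw payload. The two-byte, little-endian format is entirely hypothetical - the actual Circadia protocol has not been published - so treat this as an illustration of the idea, not the device.

```python
# Toy sketch: decoding a body-temperature reading from a wearable's
# Bluetooth payload. The 2-byte little-endian format (hundredths of a
# degree Celsius) is a hypothetical stand-in for the real protocol.
import struct

def decode_temperature(payload: bytes) -> float:
    """Interpret a 2-byte payload as hundredths of a degree Celsius."""
    (raw,) = struct.unpack("<H", payload)  # unsigned 16-bit, little-endian
    return raw / 100.0

# A device reporting 36.6 C would send the integer 3660 on the wire.
packet = struct.pack("<H", 3660)
print(decode_temperature(packet))
```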
Cannon is hardly the first individual to have technology implanted into his or her body; just ask former Vice President Dick Cheney (who had a battery-powered heart pump implanted), or any dog with a microchip.
Some are referring to biohacking as the next wave in evolution. "I think that's the trend, and where we're heading," according to futurist and sci-fi author James Rollins.
"There's a whole 'transhuman' movement, which is the merging of biology and machine," Rollins told LiveScience in an earlier interview. "Google Glass is one small step, and now there's a Japanese scientist who's developed the contact lens equivalent of Google Glass. And those are two things you put right on, if not in, your body. So I think we're already moving that way, and quite rapidly."
Cannon sees future refinements as being able to do more than just passively transmit information. "I think that our environment should listen more accurately and more intuitively to what's happening in our body," Cannon told Motherboard. "So if, for example, I've had a stressful day, the Circadia will communicate that to my house and will prepare a nice relaxing atmosphere for when I get home: dim the lights, [draw] a hot bath."
John_12:48 He that rejecteth me, and receiveth not my words, hath one that judgeth him: the word that I have spoken, the same shall judge him in the last day.
Brain-like computers are learning from experience
Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.
The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming - for example, moving a robot's arm smoothly and efficiently - but it can also sidestep and even tolerate errors, potentially making the term "computer crash" obsolete.
The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals.
In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That could have enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming.
Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, although a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon.
"We're moving from engineering computing systems to something that has many of the characteristics of biological computing," said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centres devoted to developing these new kinds of computer circuits.
Conventional computers are limited by what they have been programmed to do. Computer vision systems, for example, only "recognise" objects that can be identified by the statistics-oriented algorithms programmed into them. An algorithm is like a recipe, a set of step-by-step instructions to perform a calculation.
But last year, Google researchers were able to get a machine-learning algorithm, known as a neural network, to perform an identification task without supervision. The network scanned a database of 10 million images, and in doing so trained itself to recognise cats.
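Unsupervised learning of this kind can be illustrated at a vastly smaller scale. The sketch below groups numbers into clusters without ever being told their labels - a toy analogue of the self-taught "cat detector," nothing like Google's billion-connection network:

```python
# Toy unsupervised learning: 1-D k-means groups data points into
# clusters with no labels supplied, discovering the structure itself.
def kmeans_1d(points, k=2, iters=20):
    centers = [min(points), max(points)]              # crude initialisation
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                              # assign to nearest center
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]   # recompute the means
    return sorted(centers)

# Two obvious groups; the algorithm finds both with no supervision.
data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
print(kmeans_1d(data))
```

The same principle - let statistical structure in the data shape the model, rather than hand-coding rules - is what the cat experiment demonstrated at scale.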
In June, the company said it had used those neural network techniques to develop a new search service to help customers find specific photos more accurately.
The new approach, used in hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford's Brains in Silicon research program, said that that was also its limitation, as scientists are far from fully understanding how brains function.
"We have no clue," he said. "I'm an engineer, and I build things. There are these highfalutin theories, but give me one that will let me build something."
Until now, the design of computers was dictated by ideas originated by the physicist John von Neumann about 65 years ago. Microprocessors perform operations at lightning speed, following instructions programmed using long strings of 1s and 0s. They generally store that information separately in what is known, colloquially, as memory, either in the processor itself, in adjacent storage chips or in higher capacity magnetic disk drives.
The data - for instance, temperatures for a climate model or letters for word processing - are shuttled in and out of the processor's short-term memory while the computer carries out the programmed action. The result is then moved to its main memory.
The new processors consist of electronic components that can be connected by wires that mimic biological synapses. Because they are based on large groups of neuron-like elements, they are known as neuromorphic processors, a term credited to the California Institute of Technology physicist Carver Mead, who pioneered the concept in the late 1980s.
They are not "programmed". Rather, the connections between the circuits are "weighted" according to correlations in data that the processor has already "learned". Those weights are altered as data flows into the chip, and when enough signal accumulates a circuit "spikes". That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions.
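The spiking behaviour described above can be modelled, very crudely, in a few lines of software (real neuromorphic chips implement it directly in hardware, and the parameter values here are arbitrary):

```python
# Minimal leaky integrate-and-fire neuron: weighted input accumulates
# in a "membrane potential" that decays (leaks) each step; crossing the
# threshold emits a spike and resets the potential.
def lif_run(input_currents, weight=0.5, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential, spikes = 0.0, []
    for t, current in enumerate(input_currents):
        potential = potential * leak + weight * current  # integrate with leak
        if potential >= threshold:                       # threshold crossed:
            spikes.append(t)                             # emit a spike and
            potential = 0.0                              # reset the membrane
    return spikes

print(lif_run([1, 1, 1, 0, 0, 1, 1, 1]))
```

Real chips also adjust the weights online as data arrives; this sketch keeps the weight fixed for brevity.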
"Instead of bringing data to computation as we do today, we can now bring computation to data," said Dharmendra Modha, an IBM computer scientist who leads the company's cognitive computing research effort. "Sensors become the computer, and it opens up a new way to use computer chips that can be everywhere."
The new computers, which are still based on silicon chips, will not replace today's computers but will augment them, at least for now.
Many computer designers see them as coprocessors, meaning that they can work in tandem with other circuits that can be embedded in smartphones and in the giant centralised computers that make up the cloud. Modern computers already consist of a variety of coprocessors that perform specialised tasks, like producing graphics on smartphones and converting visual, audio and other data for laptops.
One great advantage of the new approach is its ability to tolerate glitches. Traditional computers are precise, but they cannot work around the failure of even a single transistor. With the biological designs, the algorithms are ever changing, allowing the system to continuously adapt and work around failures to complete tasks.
Traditional computers are also remarkably energy inefficient, especially when compared with actual brains, which the new neurons are built to mimic.
IBM announced in 2012 that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons - more than 10 per cent of a human brain. It ran about 1500 times more slowly than a brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Modha said.
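Taking the article's figures at face value, a back-of-the-envelope calculation shows the scale of the gap. The 4-megawatt number below is an assumed stand-in for "several megawatts", and the linear extrapolation is itself a big assumption:

```python
# Rough scaling of the article's figures: the simulation covered ~10%
# of a brain at 1/1500 real-time speed. Extrapolating linearly to a
# full brain running in real time multiplies the power draw.
sim_power_watts = 4e6        # "several megawatts" -- 4 MW assumed here
slowdown = 1500              # simulation ran 1500x slower than a brain
brain_fraction = 0.10        # ~10 billion of ~100 billion neurons
brain_power_watts = 20       # a biological brain's power budget

full_realtime_watts = sim_power_watts * slowdown / brain_fraction
print(f"{full_realtime_watts / 1e9:.0f} GW")              # tens of gigawatts
print(f"{full_realtime_watts / brain_power_watts:.1e}x")  # efficiency gap
```

Under these assumptions the gap between silicon and biology comes out to roughly a billion-fold - city-scale power versus a light bulb's worth.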
IBM and Qualcomm, as well as the Stanford research team, have already designed neuromorphic processors, and Qualcomm has said that it is coming out in 2014 with a commercial version, which is expected to be used largely for further development. Moreover, many universities are now focused on this new style of computing. This fall, the National Science Foundation financed the Centre for Brains, Minds and Machines, a new research centre based at the Massachusetts Institute of Technology, with Harvard and Cornell.
The largest class on campus this fall at Stanford was a graduate level machine-learning course covering both statistical and biological approaches, taught by the computer scientist Andrew Ng. More than 760 students enrolled.
"That reflects the zeitgeist," said Terry Sejnowski, a computational neuroscientist at the Salk Institute, who pioneered early biologically inspired algorithms. "Everyone knows there is something big happening, and they're trying to find out what it is."
Revelation 13:14 And deceiveth them that dwell on the earth by the means of those miracles which he had power to do in the sight of the beast; saying to them that dwell on the earth, that they should make an image to the beast, which had the wound by a sword, and did live.
Rev 13:15 And he had power to give life unto the image of the beast, that the image of the beast should both speak, and cause that as many as would not worship the image of the beast should be killed.
* Posted by BA
Govt is messing with the human brain
Aug. 2012 They Really Do Want To Implant Microchips Into Your Brain
Are you ready to have a microchip implanted into your brain? That might not sound very appealing to you at this point, but this is exactly what the big pharmaceutical companies and the big technology companies have planned for our future. They are pumping millions of dollars into researching "cutting edge" technologies that will enable implantable microchips to greatly "enhance" our health and our lives.
Of course nobody is going to force you to have a microchip implanted into your brain when they are first introduced. Initially, brain implants will be marketed as "revolutionary breakthroughs" that can cure chronic diseases and that can enable the disabled to live normal lives. When the "benefits" of such technology are demonstrated to the general public, soon most people will want to become "super-abled".
Just imagine the hype that will surround these implants when people discover that you can get rid of your extra weight in a matter of days or that you can download an entire college course into your memory in just a matter of hours. The possibilities for this kind of technology are endless, and it is just a matter of time before having microchips implanted into your brain is considered to be quite common.
What was once science fiction is rapidly becoming reality, and it is going to change the world forever.
But aren't there some very serious potential downsides to having microchips implanted into our brains? Of course there are.
Unfortunately, this technology is not as far off as you might think, and most people are not even talking about what the negative consequences might be.
According to a recent article in the Financial Times, the pharmaceutical company of the future will include a "bioelectronics" business that "treats disease through electrical signalling in the brain and elsewhere."
Diseases such as diabetes and epilepsy and conditions such as obesity and depression will be treated "through electronic implants into the brain rather than pills or injections."
These implants will send electrical signals to cells and organs that are "malfunctioning". People will be totally "cured" without ever having to pop a pill or go under the knife.
It sounds too good to be true, right?
Well, the Financial Times says that British pharmaceutical giant GlaxoSmithKline is working very hard to develop these kinds of technologies. Moncef Slaoui, the head of research and development at GlaxoSmithKline, says that the "challenge is to integrate the work – in brain-computer interfaces, materials science, nanotechnology, micro-power generation – to provide therapeutic benefit."
If a brain implant could cure a disease that you have been suffering from your whole life, would you take it?
A lot of people are going to be faced with that kind of a decision in future years.
And this kind of technology is advancing very rapidly. In fact, some researchers have already had success treating certain diseases by implanting microchips into the brains of rats. The following is from a recent Mashable article....
Stroke and Parkinson’s Disease patients may benefit from a controversial experiment that implanted microchips into lab rats. Scientists say the tests produced effective results in brain damage research.
Rats showed motor function in formerly damaged gray matter after a neural microchip was implanted under the rat’s skull and electrodes were transferred to the rat’s brain. Without the microchip, rats with damaged brain tissue did not have motor function. Both strokes and Parkinson’s can cause permanent neurological damage to brain tissue, so this scientific research brings hope.
In addition, the U.S. government has been working on implantable microchips that would monitor the health of our soldiers and enhance their abilities in the field.
So this technology is definitely coming.
But it must be very complicated to get a microchip implanted into your brain, right?
Actually it is fairly simple.
According to an article in the Wall Street Journal, the typical procedure is very quick and often requires just an overnight stay in the hospital....
Neural implants, also called brain implants, are medical devices designed to be placed under the skull, on the surface of the brain. Often as small as an aspirin, implants use thin metal electrodes to "listen" to brain activity and in some cases to stimulate activity in the brain. Attuned to the activity between neurons, a neural implant can essentially "listen" to your brain activity and then "talk" directly to your brain.
If that prospect makes you queasy, you may be surprised to learn that the installation of a neural implant is relatively simple and fast. Under anesthesia, an incision is made in the scalp, a hole is drilled in the skull, and the device is placed on the surface of the brain. Diagnostic communication with the device can take place wirelessly. When it is not an outpatient procedure, patients typically require only an overnight stay at the hospital.
But is it really safe to have a device implanted into your head that can "talk" directly to your brain?
Many large corporations are banking on the fact that, in a world always hungry for new technology, most people will not be bothered by such things.
For example, Intel is working on sensors that will be implanted in the brain that will be able to directly control computers and cell phones. The following is an excerpt from a Computer World UK article....
By the year 2020, you won't need a keyboard and mouse to control your computer, say Intel researchers. Instead, users will open documents and surf the web using nothing more than their brain waves.
Scientists at Intel's research lab in Pittsburgh are working to find ways to read and harness human brain waves so they can be used to operate computers, television sets and cell phones. The brain waves would be harnessed with Intel-developed sensors implanted in people's brains.
The scientists say the plan is not a scene from a sci-fi movie; Big Brother won't be planting chips in your brain against your will. Researchers expect that consumers will want the freedom they will gain by using the implant.
Once again, this is not something that will be forced on you against your will.
These big corporations are banking on the fact that a lot of people will want to get these brain implants.
Even now, some video game makers are developing headsets that allow users to play games using their brain waves rather than a joystick or a control pad.
Other companies want to make it possible to directly connect your brain to the Internet.
As I have written about previously, IBM is aggressively working to develop this kind of technology. The following is from a recent IBM press release....
IBM scientists are among those researching how to link your brain to your devices, such as a computer or a smartphone. If you just need to think about calling someone, it happens. Or you can control the cursor on a computer screen just by thinking about where you want to move it.
Scientists in the field of bioinformatics have designed headsets with advanced sensors to read electrical brain activity that can recognize facial expressions, excitement and concentration levels, and thoughts of a person without them physically taking any actions.
The potential "benefits" of such technology are almost beyond imagination. An article on the website of the Science Channel put it this way....
If you could pump data directly into your gray matter at, say, 50 mbps — the top speed offered by one major U.S. internet service provider — you’d be able to read a 500-page book in just under two-tenths of a second.
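That claim checks out arithmetically if one assumes roughly 2,000 characters of plain text per page (the characters-per-page figure is our assumption, not the article's):

```python
# Verifying the Science Channel arithmetic: a 500-page book streamed
# over a 50 Mbps link. The 2,000 characters/page density is assumed.
pages = 500
chars_per_page = 2000            # typical paperback density (assumed)
bits_per_char = 8                # plain ASCII text
link_bits_per_second = 50e6      # 50 Mbps

book_bits = pages * chars_per_page * bits_per_char
seconds = book_bits / link_bits_per_second
print(f"{seconds:.2f} s")   # "just under two-tenths of a second"
```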
How would the world change if you could download a lifetime of learning directly into your brain in a matter of weeks?
The possibilities are endless. But so is the potential for abuse.
Implantable microchips that can "talk" directly to the brain would give a tyrannical government the ultimate form of control.
If you could download thoughts and feelings directly into the brains of your citizens, you could achieve total control and never have to worry that they would turn on you.
In fact, you could potentially program these chips to make your citizens feel good all the time. You could have these chips produce a "natural high" that never ends. That would make your citizens incredibly dependent on the chips and they would never want to give them up.
This kind of technology has the potential to be one of the greatest threats to liberty and freedom in the history of mankind.
At first these implantable microchips will be sold to us as one of the greatest "breakthroughs" ever, but in the end they could end up totally enslaving us.
So I will never be taking any kind of a brain implant, and I hope that you will not either.
* Posted by BA
Obama administration plans decade-long effort to map the brain
Feb. 2013 The Obama administration is planning a decade-long scientific effort to examine the workings of the human brain and build a comprehensive map of its activity, seeking to do for the brain what the Human Genome Project did for genetics.
The project, which the administration has been looking to unveil as early as March, will include federal agencies, private foundations and teams of neuroscientists and nanoscientists in a concerted effort to advance the knowledge of the brain's billions of neurons and gain greater insights into perception, actions and, ultimately, consciousness.
Scientists with the highest hopes for the project also see it as a way to develop the technology essential to understanding diseases such as Alzheimer's and Parkinson's, and to find new therapies for a variety of mental illnesses.
Moreover, the project holds the potential to pave the way for advances in artificial intelligence.
The project, which could ultimately cost billions of dollars, is expected to be part of the president's budget proposal next month. And four scientists and representatives of research institutions said they had participated in planning for what is being called the Brain Activity Map project.
The details are not final, and it is not clear how much federal money would be proposed or approved for the project in a time of fiscal constraint or how far the research would be able to get without significant federal financing.
In his State of the Union address, President Barack Obama cited brain research as an example of how the government should "invest in the best ideas."
"Every dollar we invested to map the human genome returned $140 to our economy — every dollar," he said. "Today, our scientists are mapping the human brain to unlock the answers to Alzheimer's. They're developing drugs to regenerate damaged organs, devising new materials to make batteries 10 times more powerful. Now is not the time to gut these job-creating investments in science and innovation."
Story Landis, the director of the National Institute of Neurological Disorders and Stroke, said that when she heard Obama's speech, she thought he was referring to an existing National Institutes of Health project to map the static human brain.
"But he wasn't," she said. "He was referring to a new project to map the active human brain that the NIH hopes to fund next year."
Indeed, after the speech, Francis Collins, the director of the NIH, may have inadvertently confirmed the plan when he wrote in a Twitter message: "Obama mentions the #NIH Brain Activity Map in #SOTU."
A spokesman for the White House Office of Science and Technology Policy declined to comment about the project.
The initiative, if successful, could provide a lift for the economy.
"The Human Genome Project was on the order of about $300 million a year for a decade," said George Church, a Harvard University molecular biologist who helped create that project and said he was helping to plan the Brain Activity Map project. "If you look at the total spending in neuroscience and nanoscience that might be relative to this today, we are already spending more than that. We probably won't spend less money, but we will probably get a lot more bang for the buck."
Scientists involved in the planning said they hoped that federal financing for the project would be more than $300 million a year, which if approved by Congress would amount to at least $3 billion over the 10 years.
The Human Genome Project cost $3.8 billion. It was begun in 1990, and its goal, the mapping of the complete human genome, or all the genes in human DNA, was achieved ahead of schedule, in April 2003. A federal government study of the impact of the project indicated that it returned $800 billion by 2010.
The advent of new technology that allows scientists to identify firing neurons in the brain has led to numerous brain-research projects around the world. Yet the brain remains one of the greatest scientific mysteries.
Composed of roughly 100 billion neurons that each electrically "spike" in response to outside stimuli, as well as in vast ensembles based on conscious and unconscious activity, the human brain is so complex that scientists have not yet found a way to record the activity of more than a small number of neurons at once, and in most cases that is done invasively with physical probes.
But a group of nanotechnologists and neuroscientists say they believe that technologies are at hand to make it possible to observe and gain a more complete understanding of the brain, and to do it less intrusively.
In June in the journal Neuron, six leading scientists proposed pursuing a number of new approaches for mapping the brain.
One possibility is to build a complete model map of brain activity by creating fleets of molecule-size machines to noninvasively act as sensors to measure and store brain activity at the cellular level. The proposal envisions using synthetic DNA as a storage mechanism for brain activity.
"Not least, we might expect novel understanding and therapies for diseases such as schizophrenia and autism," wrote the scientists, who include Church; Ralph Greenspan, the associate director of the Kavli Institute for Brain and Mind at the University of California, San Diego; A. Paul Alivisatos, the director of the Lawrence Berkeley National Laboratory; Miyoung Chun, a molecular geneticist who is the vice president for science programs at the Kavli Foundation; Michael Roukes, a physicist at the California Institute of Technology; and Rafael Yuste, a neuroscientist at Columbia University.
The Obama proposal seems to have evolved in a manner similar to the Human Genome Project, scientists said.
"The genome project arguably began in 1984, where there were a dozen of us who were kind of independently moving in that direction but didn't really realize there were other people who were as weird as we were," Church said.
Intel wants to plug a smartphone into your brain
May 3, 2012 - Show off a new gadget to your friends or family and inevitably one person in the group will declare,
"Soon they'll just plug these things directly into your brain!" And everyone will laugh, as if they've never heard that joke before.
Rom 3:19 Now we know that what things soever the law saith, it saith to them who are under the law: that every mouth may be stopped, and all the world may become guilty before God.
Rom 3:20 Therefore by the deeds of the law there shall no flesh be justified in his sight: for by the law is the knowledge of sin.
Rom 3:21 But now the righteousness of God without the law is manifested, being witnessed by the law and the prophets;
Rom 3:22 Even the righteousness of God which is by faith of Jesus Christ unto all and upon all them that believe: for there is no difference:
Rom 3:23 For all have sinned, and come short of the glory of God;
Scientists grow 'mini human brains' from stem cells
Aug 2013 The lab-grown "cerebral organoids" may help researchers determine the causes of brain disorders, such as schizophrenia and autism.
LONDON — Scientists have grown the first mini human brains in a laboratory and say their success could lead to new levels of understanding about the way brains develop and what goes wrong in disorders like schizophrenia and autism.
Researchers based in Austria started with human stem cells and created a culture in the lab that allowed them to grow into so-called "cerebral organoids" — or mini brains — that consisted of several distinct brain regions.
It is the first time that scientists have managed to replicate the development of brain tissue in three dimensions.
Using the organoids, the scientists were then able to produce a biological model of how a rare brain condition called microcephaly develops — suggesting the same technique could in future be used to model disorders like autism or schizophrenia that affect millions of people around the world.
"This study offers the promise of a major new tool for understanding the causes of major developmental disorders of the brain ... as well as testing possible treatments," said Paul Matthews, a professor of clinical neuroscience at Imperial College London, who was not involved in the research but was impressed with its results.
Zameel Cader, a consultant neurologist at Britain's John Radcliffe Hospital in Oxford, described the work as "fascinating and exciting". He said it extended the possibility of stem cell technologies for understanding brain development and disease mechanisms - and for discovering new drugs.
Although it starts as relatively simple tissue, the human brain swiftly develops into the most complex known natural structure, and scientists are largely in the dark about how that happens.
This makes it extremely difficult for researchers to gain an understanding of what might be going wrong in — and therefore how to treat — many common disorders of the brain such as depression, schizophrenia and autism.
GROWING STEM CELLS
To create their brain tissue, Juergen Knoblich and Madeline Lancaster at Austria's Institute of Molecular Biotechnology and fellow researchers at Britain's Edinburgh University Human Genetics Unit began with human stem cells and grew them with a special combination of nutrients designed to capitalize on the cells' innate ability to organize into complex organ structures.
They grew tissue called neuroectoderm — the layer of cells in the embryo from which all components of the brain and nervous system develop.
Fragments of this tissue were then embedded in a scaffold and put into a spinning bioreactor — a system that circulates oxygen and nutrients to allow them to grow into cerebral organoids.
After a month, the fragments had organized themselves into primitive structures that could be recognized as developing brain regions such as retina, choroid plexus and cerebral cortex, the researchers explained in a telephone briefing.
At two months, the organoids reached a maximum size of around 0.16 inches, they said. Although they were very small and still a long way from resembling anything like the detailed structure of a fully developed human brain, they did contain firing neurons and distinct types of neural tissue.
"This is one of the cases where size doesn't really matter," Knoblich told reporters.
"Our system is not optimized for generation of an entire brain and that was not at all our goal. Our major goal was to analyze the development of human brain (tissue) and generate a model system we can use to transfer knowledge from animal models to a human setting."
In an early sign of how such mini brains may be useful for studying disease in the future, Knoblich's team were able to use their organoids to model the development of microcephaly, a rare neurological condition in which patients develop an abnormally small head, and identify what causes it.
Both the research team and other experts acknowledged, however, that the work was a very long way from growing a fully-functioning human brain in a laboratory.
"The human brain is the most complex thing in the known universe and has a frighteningly elaborate number of connections and interactions, both between its numerous subdivisions and the body in general," said Dean Burnett, lecturer in psychiatry at Cardiff University.
"Saying you can replicate the workings of the brain with some tissue in a dish in the lab is like inventing the first abacus and saying you can use it to run the latest version of Microsoft Windows — there is a connection there, but we're a long way from that sort of application yet."
Romans 11:33 O the depth of the riches both of the wisdom and knowledge of God! how unsearchable are his judgments, and his ways past finding out!
Rom 11:34 For who hath known the mind of the Lord? or who hath been his counsellor?
Rom 11:35 Or who hath first given to him, and it shall be recompensed unto him again?
Rom 11:36 For of him, and through him, and to him, are all things: to whom be glory for ever. Amen.
Method of recording brain activity could lead to mind-reading devices, Stanford scientists say
8 Dec 2013 A brain region activated when people are asked to perform mathematical calculations in an experimental setting is similarly activated when they use numbers — or even imprecise quantitative terms, such as “more than”— in everyday conversation, according to a study by Stanford University School of Medicine scientists.
Using a novel method, the researchers collected the first solid evidence that the pattern of brain activity seen in someone performing a mathematical exercise under experimentally controlled conditions is very similar to that observed when the person engages in quantitative thought in the course of daily life.
“We’re now able to eavesdrop on the brain in real life,” said Josef Parvizi, MD, PhD, associate professor of neurology and neurological sciences and director of Stanford’s Human Intracranial Cognitive Electrophysiology Program. Parvizi is the senior author of the study, to be published in Nature Communications. The study’s lead authors are postdoctoral scholar Mohammad Dastjerdi, MD, PhD, and graduate student Muge Ozker.
The finding could lead to “mind-reading” applications that, for example, would allow a patient who is rendered mute by a stroke to communicate via passive thinking. Conceivably, it could also lead to more dystopian outcomes: chip implants that spy on or even control people’s thoughts.
“This is exciting, and a little scary,” said Henry Greely, JD, the Deane F. and Kate Edelman Johnson Professor of Law and steering committee chair of the Stanford Center for Biomedical Ethics, who played no role in the study but is familiar with its contents and described himself as “very impressed” by the findings. “It demonstrates, first, that we can see when someone’s dealing with numbers and, second, that we may conceivably someday be able to manipulate the brain to affect how someone deals with numbers.”
The researchers monitored electrical activity in a region of the brain called the intraparietal sulcus, known to be important in attention and eye and hand motion. Previous studies have hinted that some nerve-cell clusters in this area are also involved in numerosity, the mathematical equivalent of literacy.
However, the techniques that previous studies have used, such as functional magnetic resonance imaging, are limited in their ability to study brain activity in real-life settings and to pinpoint the precise timing of nerve cells' firing patterns. These studies have focused on testing just one specific function in one specific brain region, and have tried to eliminate or otherwise account for every possible confounding factor. In addition, the experimental subjects would have to lie more or less motionless inside a dark, tubular chamber whose silence would be punctuated by constant, loud mechanical banging noises while images flashed on a computer screen.
“This is not real life,” said Parvizi. “You’re not in your room, having a cup of tea and experiencing life’s events spontaneously.” A profoundly important question, he said, is: “How does a population of nerve cells that has been shown experimentally to be important in a particular function work in real life?”
His team’s method, called intracranial recording, provided exquisite anatomical and temporal precision and allowed the scientists to monitor brain activity when people were immersed in real-life situations. Parvizi and his associates tapped into the brains of three volunteers who were being evaluated for possible surgical treatment of their recurring, drug-resistant epileptic seizures.
The procedure involves temporarily removing a portion of a patient’s skull and positioning packets of electrodes against the exposed brain surface. For up to a week, patients remain hooked up to the monitoring apparatus while the electrodes pick up electrical activity within the brain. This monitoring continues uninterrupted for patients’ entire hospital stay, capturing their inevitable repeated seizures and enabling neurologists to determine the exact spot in each patient’s brain where the seizures are originating.
During this whole time, patients remain tethered to the monitoring apparatus and mostly confined to their beds. But otherwise, except for the typical intrusions of a hospital setting, they are comfortable, free of pain and free to eat, drink, think, talk to friends and family in person or on the phone, or watch videos.
The electrodes implanted in patients’ heads are like wiretaps, each eavesdropping on a population of several hundred thousand nerve cells and reporting back to a computer.
In the study, participants’ actions were also monitored by video cameras throughout their stay. This allowed the researchers later to correlate patients’ voluntary activities in a real-life setting with nerve-cell behavior in the monitored brain region.
As part of the study, volunteers answered true/false questions that popped up on a laptop screen, one after another. Some questions required calculation — for instance, is it true or false that 2 + 4 = 5? — while others demanded what scientists call episodic memory — true or false: I had coffee at breakfast this morning. In other instances, patients were simply asked to stare at the crosshairs at the center of an otherwise blank screen to capture the brain’s so-called “resting state.”
Consistent with other studies, Parvizi’s team found that electrical activity in a particular group of nerve cells in the intraparietal sulcus spiked when, and only when, volunteers were performing calculations.
Afterward, Parvizi and his colleagues analyzed each volunteer’s daily electrode record, identified many spikes in intraparietal-sulcus activity that occurred outside experimental settings, and turned to the recorded video footage to see exactly what the volunteer had been doing when such spikes occurred.
They found that when a patient mentioned a number — or even a quantitative reference, such as “some more,” “many” or “bigger than the other one” — there was a spike of electrical activity in the same nerve-cell population of the intraparietal sulcus that was activated when the patient was doing calculations under experimental conditions.
That was an unexpected finding. “We found that this region is activated not only when reading numbers or thinking about them, but also when patients were referring more obliquely to quantities,” said Parvizi.
“These nerve cells are not firing chaotically,” he said. “They’re very specialized, active only when the subject starts thinking about numbers. When the subject is reminiscing, laughing or talking, they’re not activated.” Thus, it was possible to know, simply by consulting the electronic record of participants’ brain activity, whether they were engaged in quantitative thought during nonexperimental conditions.
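The basic move described above — flag moments when the electrode record rises well above its baseline, then check the video for what the patient was doing — can be sketched in a few lines. This is a toy illustration under invented assumptions (a pre-computed activity trace and a simple standard-deviation threshold), not the study's actual signal-processing pipeline:

```python
# Toy sketch: find intervals where a simulated electrode signal exceeds
# its baseline by a chosen margin, the same basic idea as spotting "spikes"
# of intraparietal-sulcus activity to line up against video timestamps.
# The 2-sigma threshold and the trace below are invented for illustration.
from statistics import mean, stdev

def detect_spikes(signal, sample_rate_hz, n_sigma=2.0):
    """Return (start_s, end_s) intervals where signal > mean + n_sigma * sd."""
    mu, sd = mean(signal), stdev(signal)
    threshold = mu + n_sigma * sd
    events, start = [], None
    for i, value in enumerate(signal):
        if value > threshold and start is None:
            start = i                                # burst begins
        elif value <= threshold and start is not None:
            events.append((start / sample_rate_hz, i / sample_rate_hz))
            start = None                             # burst ends
    if start is not None:                            # burst runs to the end
        events.append((start / sample_rate_hz, len(signal) / sample_rate_hz))
    return events

# Quiet baseline with one burst around t = 2 s (sampled at 10 Hz)
trace = [0.1, 0.2, 0.1, 0.15, 0.1] * 4 + [5.0, 6.0, 5.5] + [0.1] * 5
print(detect_spikes(trace, sample_rate_hz=10))  # → [(2.0, 2.3)]
```

In the study, each such interval would then be matched against the video record to see what the patient was saying or doing at that moment.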
Any fears of impending mind control are, at a minimum, premature, said Greely. “Practically speaking, it’s not the simplest thing in the world to go around implanting electrodes in people’s brains. It will not be done tomorrow, or easily, or surreptitiously.”
Parvizi agreed. “We’re still in early days with this,” he said. “If this is a baseball game, we’re not even in the first inning. We just got a ticket to enter the stadium.”
Man distracted by cellphone dies walking off a cliff in San Diego
A man so distracted by his cellphone that he didn't see a 60-foot drop in front of him tumbled to his death off a California cliff on Christmas Day.
How technology has transformed classes, costs and social lives for college students
Bill Dameron started college in 1981. His mother, he remembers, dropped him at the curb of North Carolina’s Greensboro College with just a sack of clothes and a clock radio.
They said goodbye without ceremony. Dameron recalls that she shouted, “Good luck!” while “never removing the cigarette from her mouth as she sped off.”
Thirty years later, he bid farewell to his step-daughter, Meghan, on her first day at Colby-Sawyer College in New London, N.H., in 2011. The Damerons wouldn’t find hasty goodbyes in the school’s syllabus. Meghan's school developed a seven-hour program, as Dameron describes it, “to ease the transition for students and parents.” While he and his husband thought that went overboard, he noticed other parents and their kids took full advantage, posting farewell photos to Facebook.
“There were tears, there were hugs and there was Wi-Fi,” Dameron, 50, writes in a first-person account.
Times have changed, indeed.
When he was a student, Dameron didn’t see his mom again until Christmas — “she complained about the beard I had grown,” he notes — while Meghan and her step-father are basically still, albeit virtually, living together as a family. Every day, he can check her grades, read school news and deposit money onto her cash card.
That generational difference — how today’s technology can keep families closer — is one theme Yahoo News explored for our “Born Digital” series. We invited current students and parents of students to write about how the college experience has changed over the past few decades. Check out excerpted responses below and join us for a live hangout Thursday with Bill and some of the other writers. Check the Yahoo News Google+ page for more information coming this week and join the conversation on Twitter using #BornDigital.
‘Students never see their teachers face to face’
When my mom went to college the first time, it was 1988. She was married with three children, so she took some correspondence courses through Liberty University.
"Distance learning is much different from what it is now," she told me. "Instead of using computers, we were sent a bunch of VHS tapes with all the lectures on them. They were so boring, I used to fall asleep trying to listen to them! And we didn't have the convenience of emailing our professors if we had a problem; we had to call to get help."
The process has been completely different for me; just filling out the application to enroll is different. My mother told me she applied to Liberty through the phone, while I applied via online application. Filling out schedules, meeting teachers, and even some advising has been Internet-based so far. (VHS tapes, of course, aren't sent.) All my classes are online, and, in some cases, students never see their teachers face to face.
— Jessica Starks, 18, will study English at Itawamba Community College in Mississippi this fall.
For communicating with the outside world, there was a black dial-up pay phone in the hall, but who to call? Cell phones were decades in the future, so the pay phone was mainly for making a collect call home. Otherwise, we had envelopes, postage stamps, and stationery. The only computer we'd ever seen was this massive IBM mainframe where we struggled with impossible Fortran language, punching stackable binary cards that might as well have been hieroglyphics.
The library was installing microfiche readers and Xerox copiers, which felt like "Futurama" compared to carbon paper and the Dewey decimal system. For term papers, the lucky ones like me had an Olivetti-Underwood manual typewriter in a zipped leather carry case gifted from proud parents.
— Laurie Jo Miller Farr, 57, earned her bachelor’s and master’s degrees from Northeastern University in Boston in 1975. Her children, Christina and Timothy, are enrolled at University College London and the University of Nottingham in the UK.
I rarely go to the undergraduate library. Why bother? I can access all the documents I need from the Internet, the web offers all relevant study material, and if I require a book, I can use Google Books or download an e-book to my laptop. Better yet, the cloud often means I don't need my laptop. On one side of the campus, I can upload an e-book to Google Drive, bike to the other side of campus, and print portions of that same book on another computer — if a hard copy is needed at all.
— Phillip Wachowiak is a 19-year-old junior at the University of Michigan, where he studies biomolecular science. He plans to graduate in 2015 and then attend medical school.
Cell phones have to be the worst attention-stealing culprit in the classroom. While I try to ignore mine, I have grad school classmates who still can't stop texting during seminars. In a class of 20 undergraduates, I'd say at least four are probably paying attention to their phone at any given time, whether it be for text messaging, checking Facebook, or playing games. That means that about a fifth of the class is paying tuition to play with electronic devices and miss most of the lecture.
— Lisa Fulgham, a 24-year-old graduate student in English at Mississippi State University, hopes to finish this December.
‘College loans carry some serious interest’
Each time I tell my dad how much money I'll need for my books for the semester, he gets this sour look on his face.
He never fails to remind me (or to complain to me, rather) that the money I spend on books would have been more than enough to cover his entire tuition when he attended the University of Minnesota back in 1970. Parents always say things like that.
So I did a little research on his school website to uncover some real numbers. Lo and behold, the tuition for the 1969-70 school year was $399!
Last spring, my bookstore total came to $576.47 — surprisingly one of the lower bills I've paid. To this day, I am still mad about a required book for my environmental science class, which was so cleverly titled "Environmental Science." I paid $135 for a used copy and did not use it once.
— Dana Perry, 23, attends the University of Maryland Baltimore County and will graduate this year with a degree in gender and women's studies.
College loans carry some serious interest for those like me who did not receive significant academic scholarships and had to finance their education largely through debt. For each of my four years as an undergraduate, I borrowed around $20,000 per year to pursue my work in business school.
I was a very average student in high school, but quickly realized that this was no longer an option at the price I was paying at Fairfield. I would love to now be a debt-free graduate student pursuing a career in public accounting; however, knowing that I truly had to make my college experience all it could be gave me exactly the kick start that I needed.
— Mark Evans, 22, will graduate in May from Fairfield University in Connecticut with a master’s degree in accounting.
A college education is easier to complete now than it was in the 1980s, primarily due to changes in financial aid. I had to drop out of Northeastern University at age 19, with a 3.2 GPA, when I ran out of money in my freshman year. My daughter, Diana, is now 22 and a senior at Lyndon State College, even though I could not afford to send her there.
The good news is that the federal [loan] limits are now more reasonable — a whopping $57,500 for an undergraduate like her! As she is attending a school that costs about $7,500 annually, this means that she does not have to quit due to lack of money, like I did.
The changes in federal financial aid have made obtaining a college degree more attainable for those who could not otherwise afford it. Diana will be graduating next year, with terrific prospects at finding a job in her field. Yes, she will have to pay back the student loans, but at least she will be able to afford to.
— D.M. Cogger, 47, began school at Northeastern University in 1984.
‘I pity them for those lost opportunities’
When I headed off to SUNY-Oneonta for college in upstate New York 32 years ago, I was young, excited and completely in the dark about what I was doing and what was in store for me.
Not so for my 18-year-old son Henry, who begins his college career at Boston University this fall to study physics. He took so many Advanced Placement and International Baccalaureate classes in high school that he starts his freshman year with 84 college credits. I had three.
— Julie BoydCole, 50, studied communications at SUNY-Oneonta in 1981.
My professors can also ask more of me because of the technological tools that are readily available at my fingertips. For example, my professors will often write exams that will require the use of statistical software during the test. Professors will often design labs or workshop sessions in class that require a research tool to be learned and utilized in order to complete an assignment.
Consequently, technology is a Catch-22 in higher education today: It makes the completion of academic work swifter and more efficient, but it also raises expectations about the amount of work that should be completed by students and the quality of the work that is produced.
— Stetson Thacker, 21, is studying biology, chemistry and English at Denison University in Ohio. He will graduate in May.
My 18-year-old daughter, Vivian, who is enrolled at the University of Cincinnati, listens sympathetically to reminiscences of my four years on that campus in the 1980s, imagining pioneer days of taking classes without cell phones, the Internet, or computers.
The tedious process of research certainly has improved, as Internet searches beat poring through tomes in UC's Langsam Library. Vivian and her peers can complete their projects without leaving the dorm, or while sipping frappes at Starbucks.
Today's students are deprived of the pleasure of learning outside of the assigned topic and discoveries of music and literature, which become wonderful lifelong diversions. I pity them for those lost opportunities.
— Doug Poe, 50, attended the University of Cincinnati in the 1980s.
‘Keep doors open and all four feet on the floor’
The strict rules governing our social lives were positively convent-like when compared to campus life today. How differently we were treated back then as freshmen living in single-sex dormitories patrolled by upper-class floor captains.
Only during prescribed social hours were our male friends permitted to visit Smith Hall. We had a common room for socializing and an attendant posted at the front desk with a sign-in/sign-out book and a pamphlet of dorm rules including, "Keep doors open and all four feet on the floor."
The dorm floors were patrolled. Yes, Deborah, Jim, Jeff, and I got in trouble, returning from a McDonald's double date after lock-up at 11 p.m. That meant an appearance before the governing parietal committee and a lockdown. Mind you, I am describing college, not prison.
Boyfriends and pot could, and did, get the Boston Police to your door, but provisions for cigarette smoking were commonplace, with ashtrays placed everywhere.
— Laurie Jo Miller Farr
Compared to my parents, our physical social lives have suffered in favor of an online-physical hybrid. Instead of going to the library now, my roommates and I sit together in the living room — on our laptops. If I can't hang out with a group of friends because I have work to do, we will Snapchat each other our activities all night.
I don't have much time for dating, but I've noticed how couples spend less time speaking and more time on their phones when they're in line at Starbucks or at the movie theater.
— Phillip Wachowiak
‘There is no curfew, Mom’
We expect so much more from our kids today academically, but we let them slide dangerously in other areas. My son can speak Japanese, but he can't seem to grasp the concept that "cleaning up the dishes" includes the pots on the stove and the cutting board on the island. Of course, that won't be a problem at college — he has a meal plan.
Parents are more involved in their kids’ school lives these days. There are more teacher conferences and parents are encouraged to monitor their children's grades online. The result is more knowledge, but less experience. Kids don't get to be responsible for themselves.
— Kim Jacobs Walker, 48, graduated from the University of Texas at Austin in 1997 with a bachelor’s in Plan II and a concentration in English. Her son, Devon, 18, will attend Austin College in Sherman, Texas, this fall.
In 2008, during Krissy's senior year in high school, we toured the University of Miami. I was taken aback as we passed students sprawled on the lawn in bikinis, sun-bathing while studying. That would have caused quite a commotion back in my day. The campus police would have declared this indecent exposure! Very risqué!
The day I moved my daughter into her dorm was an education in itself. At 57, I consider myself pretty "hip" and open-minded. I gasped when I realized I was moving my freshman daughter into a coed dorm. My mouth flew open as I watched in amazement young men stroll past females in the halls. During my college days, no freshman ever slept in a building with the opposite sex. That was a privilege reserved for upperclassmen.
A bit shaken, I asked Krissy when her curfew was. She looked at me like I was from the Stone Age.
"There is no curfew, Mom."
— Rhonda Manning, 57, graduated in 1977 from Hampton University.
‘Competition is cutthroat’
Today's digitized, fast-changing world validates a prediction that then-California Gov. Jerry Brown made in 1981 to one of my college classes. He said that 25 percent of us would work in careers that hadn't been invented yet, so focusing on a major wasn't as important as a well-rounded, diverse education.
Now that I have a son in college and a daughter in high school, and Jerry Brown is governor of California again, I use Brown's prophetic and wise remarks to illustrate to my kids the need for a broad education and an open mind about the future.
After all, who knows what changes and new jobs the next 30 years will bring?
— Dyanne Weiss, 54, graduated from CSU Northridge in California in 1982. Her 19-year-old son studies screenwriting at Emerson College in Boston.
Twenty-five years ago, one didn't always need a college degree to find a career, become successful, or even amass great wealth. Then, employers focused on personality, interview skills, connections, and skill in a particular field. My father earned his bachelor's degree in business administration from Old Dominion University in 1988, but he has since worked his way up in the Virginia Beach City Public Schools system through hard work, a good attitude, and his skills. I mean, it makes sense: In our parents' generation (and today), most of what you need to know is taught on the job.
But nowadays the focus seems to have shifted from personal attributes and skill to the number of awards and degrees a student has gained. A bachelor's degree is expected of most students entering the job market, and many need further schooling just to stay competitive. And that competition is cutthroat.
— Nicole Woodhouse, 21, will be a senior this fall at the College of William and Mary in Virginia, where she is majoring in psychology and biology.
In 1974, my mom enrolled at Sam Houston with one main purpose: earn her degree in sociology and become a social worker. Back then, going to college wasn't something that seemingly everyone did after high school. Many of her friends found good-paying jobs right after graduating, meaning you only went to college if you knew you needed a certain education for a certain career.
Now, about the only job opportunities you can get right out of high school are fast food and low-level retail, and both of those are minimum-wage with almost no room for promotion or a sizable pay increase. A college education has become less of an option and more of a necessity.
College has almost become the "new high school."
— Thomas Leger, 20, will graduate with a degree in philosophy from Sam Houston State University in Huntsville, Texas, next summer.
Quantum Internet On The Way
April 13th, 2012 Scientists Establish First Working Quantum Network.
With the number of components we can cram onto a chip slowly reaching its physical limit, quantum computing has become the next big thing that could revolutionize the computing world. IBM is even on the cusp of building actual quantum computer prototypes. But what good is any of that if we don’t have a quantum Internet? Fortunately, we do. A team of scientists at the Max Planck Institute of Quantum Optics has just established the first working quantum network.
As with most firsts, the network is quite primitive at the moment, connecting a mere two atomic nodes with 60 meters of fiber-optic cable. Since a quantum computer works with qubits, which can have values of 0, 1, or a quantum superposition of both, a quantum Internet would have to be able to communicate in qubits. This prototype network accomplishes that by using photons to carry the information around.
The team managed to rig up a laser to hit the first networked atom in a way that preserves the atom's quantum state while also producing a photon with that information encoded onto it. The photon then shoots off down the fiber-optic cable, delivering the information to the second atom. Network achieved. On top of that, the researchers managed to get the two networked atoms to entangle, which means the network should be completely scalable to something along the lines of an Internet.
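The qubit notion the article leans on — 0, 1, or a superposition of both — can be made concrete with a toy state-vector calculation. This is a pedagogical sketch of the concept only, not a model of the Max Planck atom-photon experiment:

```python
# Toy state-vector illustration of a qubit: a pair of complex amplitudes.
# A Hadamard gate takes the definite state |0> into an equal superposition
# of 0 and 1 — the property that lets a quantum network carry qubits
# rather than plain bits. Purely pedagogical; not the actual experiment.
from math import sqrt, isclose

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a, b]."""
    a, b = state
    h = 1 / sqrt(2)
    return [h * (a + b), h * (a - b)]

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitude magnitudes."""
    return [abs(amp) ** 2 for amp in state]

ket0 = [1 + 0j, 0 + 0j]          # the classical bit 0 as a quantum state
superposed = hadamard(ket0)      # equal superposition of 0 and 1
print(probabilities(superposed)) # both outcomes equally likely (≈ [0.5, 0.5])
```

Applying the gate twice returns the qubit to |0⟩ exactly — a reversibility that classical randomness does not have, and part of what makes faithfully transmitting a qubit (here, via a photon) delicate.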
Doctors seek help on cancer treatment from IBM supercomputer
2013 - IBM's Watson supercomputer has beaten expert "Jeopardy" quiz show contestants, and its predecessor defeated a world chess champion. Now, doctors hope it can help them outsmart cancer.
Oncologists at two medical groups have started to test IBM's Watson supercomputer system in an effort to improve the speed and efficacy of treatments, the company said on Friday.
The Maine Center for Cancer Medicine and Westmed Medical Group will begin testing an application based on Watson's cognitive computing to help diagnose lung cancer and recommend treatment, IBM said.
"Access to comprehensive care can be difficult in rural areas such as southern Maine," said Tracey Weisberg, medical oncology president at Maine Center for Cancer Medicine and Blood Disorders.
"This allows the most comprehensive evidence based treatment we could have only dreamed of in the past," she added.
Watson is an artificial-intelligence supercomputer system named after legendary International Business Machines President Thomas Watson.
Thanks to its computing power, Watson can sift through 1.5 million patient records and histories to provide treatment options in a matter of seconds, based on previous treatment outcomes.
It has been fed with more than 600,000 pieces of medical evidence, 2 million pages of text from 42 medical journals and clinical trials in the area of oncology research, IBM said.
In addition, IBM partnered with clinicians and technology experts from health insurer WellPoint and Memorial Sloan-Kettering Cancer Center who spent thousands of hours to teach Watson how to process, analyze and interpret the meaning of complex clinical information, IBM said.
"Every doctor knows they cannot keep up with hundreds of new articles but every physician wants to be right and this is a way of facilitating that," said Samuel Nussbaum, chief medical officer at WellPoint.
IBM first showcased Watson's powers almost two years ago.
The computer beat two human competitors on the popular U.S. quiz show "Jeopardy!" highlighting the progress people have made in making machines able to think like them.
IBM has since further advanced Watson's linguistic and analytical abilities to develop new products such as medical diagnosis.
2Th 2:11 And for this cause God shall send them strong delusion, that they should believe a lie:
2Th 2:12 That they all might be damned who believed not the truth, but had pleasure in unrighteousness.
Rev 13:18 Here is wisdom. Let him that hath understanding count the number of the beast: for it is the number of a man; and his number is Six hundred threescore and six.
1Cor 15:50 Now this I say, brethren, that flesh and blood cannot inherit the kingdom of God; neither doth corruption inherit incorruption.
IBM to fire 14,000
Mar 12, 2016 - IBM is sending more jobs to India. 14,000 IBMers worldwide could leave under the latest redundancy programme. In the UK alone, Big Blue has put 1,352 people at risk with 185 to be told of their fate by April.
IBM to HIRE 25,000!
Dec 14, 2016 - BIG BLUE for YOU! AMERICA GREAT AGAIN!
IBM unveils plan to hire 25,000 in US on eve of Trump meeting.
IBM said it will hire 25,000 people over the next four years, about 6,000 of them in 2017.