Wednesday, November 16, 2011

AI, Are We There Yet?


By: Alon Cohen

Wow, how time flies. I remember a discussion I had about 28 years ago as if it were yesterday. At the time, I was working on a few computer programs that seemed to need human intelligence. One was an automated device-testing program that could tell a technician which board to replace based on anomalies in the way the device behaved. The second was a learning algorithm that had to play Tic-Tac-Toe by figuring out the rules of the game by itself (yes, after I saw the movie WarGames in 1983), and it got me thinking about what it would take to make a computer more human.
I was talking with a few of my friends, discussing what it would take to make a computer creative. My idea at the time was that scientists would probably be able to take a human neuron (or a few of them), place them on a chip, and use that as an intuition co-processor.
Well, it seems like MIT scientists just made it happen. They did not actually take brain tissue, but found a way to emulate the way a neuron behaves using analog circuits on a CMOS chip. I guess it took more than a few years to figure that one out. However, as it stands, I can now envision a reality just like the one depicted in Asimov's books, where business entities (like US Robotics) will own an artificial brain that surpasses any human or super (digital) computer in existence, and that will help them invent and solve problems not solvable by humans, like time travel or teleportation.
As it stands, the human brain has about 100,000,000,000 neurons and about 100,000,000,000,000 synapses connecting them. The cerebral cortex is the outermost part of the brain; it plays a key role in memory, attention, perceptual awareness, thought, language, and consciousness. The number of neurons in the cortex is estimated at 11,000,000,000, so roughly 11,000,000,000,000 synapses. Synapses are those specialized junctions through which neurons signal to each other.

Surprisingly, it took only 400 transistors to create that artificial synapse at MIT. Imagine a quad-core Itanium chip with 2,000,000,000 transistors; take that level of technology and you can create a brain with 5,000,000 synapses, or an equivalent of roughly 5,000 neurons, give or take. Hmm, about as smart as a pond snail.
Since our cortex has about 11,000,000,000,000 synapses, we need about 2.2 million of those quad-core chips to emulate the human cortex. It sounds like a scary large number, but the truth is that it is not that far off if you are an optimist.
If you apply Moore's Law to those numbers (2^21 is roughly 2.2 million, so about 21 doublings at 18-24 months each), you get that in 30-40 years, give or take, we will have the capacity to replicate a human cortex on a single chip, one that could work thousands of times faster than a human can.
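For anyone who wants to check the back-of-the-envelope math, here is a minimal sketch of the calculation above. It assumes the article's own estimates: 400 transistors per artificial synapse, a 2-billion-transistor chip, roughly 1,000 synapses per neuron, and a Moore's Law doubling every 18-24 months. These are rough figures for illustration, not measured values.

```python
import math

# Rough estimates quoted in the post above.
TRANSISTORS_PER_SYNAPSE = 400             # MIT analog synapse circuit
TRANSISTORS_PER_CHIP = 2_000_000_000      # quad-core Itanium-class chip
SYNAPSES_PER_NEURON = 1_000               # rough average assumed above
CORTEX_SYNAPSES = 11_000_000_000_000      # ~11 billion cortical neurons

synapses_per_chip = TRANSISTORS_PER_CHIP // TRANSISTORS_PER_SYNAPSE
neurons_per_chip = synapses_per_chip // SYNAPSES_PER_NEURON
chips_for_cortex = CORTEX_SYNAPSES / synapses_per_chip

print(f"Synapses per chip: {synapses_per_chip:,}")            # 5,000,000
print(f"Neurons per chip (approx.): {neurons_per_chip:,}")    # 5,000
print(f"Chips needed for a cortex: {chips_for_cortex:,.0f}")  # ~2,200,000

# Moore's Law: how many doublings until one chip holds
# what 2.2 million chips hold today?
doublings = math.log2(chips_for_cortex)                       # ~21
print(f"Doublings needed: {doublings:.1f}")
print(f"Years at 18-24 months per doubling: "
      f"{doublings * 1.5:.0f} to {doublings * 2:.0f}")        # ~32 to ~42 years
```

Running it reproduces the numbers in the post: about 5,000 neurons per chip, about 2.2 million chips for a cortex, and roughly 30-40 years of doublings to get there.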
It is not that far off, I can tell you that, especially when we live at Internet speed and each new year ends before you even notice it started. Now, way before those 30 years, say 15 years from now, we will be able to compose an artificial brain equivalent in capacity to a dog's cortex. I guess all we will need to do at that stage is hope that this technology will not bite us. If it does bite, it will help us realize the corrections we need to make, just in time, so that the first artificial human-level brains will do the right thing.

Tuesday, August 30, 2011

Science Fiction? Conspiracy theory? – You Decide

By: Alon Cohen

I will start by explaining why I have not bought HP products for the last 10 years. This is a story I have been telling every HP employee I have met at every trade show, and every friend who asked me about HP and was willing to listen.
Maybe 10 years ago, I bought an all-in-one HP printer. I really liked HP from my days working at an R&D lab during the eighties, where every really good piece of measurement equipment was either Tektronix or HP. I specifically liked the HP 200 series of PC computers, which were way ahead of their time and superior in every aspect (except cost) to the flimsy PC XTs that came out of IBM and looked like crap back in 1986-1987. So naturally, when I came to Staples around 2000 and saw the brand-new HP printers, I still carried with me that warm fuzzy feeling about those products from the old HP days.
Since I needed two printers, one for home and one for the office, I thought to myself, why not buy the same model? That way I would install the driver once on my laptop and be done with it. In theory a great idea; in practice I discovered that HP is not HP anymore and that printer drivers are not their forte, to say the least. But you know what, that is not the point.
The point is that after a year, just a week or two after the warranty expired, both printers started to display a similar error message on the screen. Normally, a person who buys one printer would not suspect foul play; however, when one printer printed thousands of pages at the office and the other printed only a few pages while sitting idle at home, one starts to suspect. My suspicion was that HP had allegedly inserted a time bomb in the printer software to make it look dead, forcing consumers to buy new printers.

To be honest, I have no proof of that, but I have been unable to shake that feeling off over the years, especially with all the other bad smells coming out of that company.

It did not end there. To add insult to injury, when I called HP's customer support they made me pay for the call and forced me to buy another cartridge, saying the new spare one I had on hand had expired. Well, clearly it had not, but if you can squeeze a few more bucks from a sucker, why not.

At some point, the HP customer support agent felt he was on a roll: one sucker customer, two bad printers. So he took it to the next step, saying, "What if I get you the newest fancy printer for only $200?" Well, I was an idiot for buying the first time, but hey, I am not that bad. And so I said, "Let me get back to you on that."

The next day, I went to the store to compare prices and, lo and behold, it was cheaper at the store! That sealed the deal from my point of view. I have never touched HP again.

Was I correct about this alleged selling methodology? You be the judge of that. However, moving to the present, I came across this WSJ article about how HP decided to commit suicide and how this whole move is so unclear to everyone in the industry, and... bang, it all became clear to me.

I wouldn't be surprised if, a few years from now, it comes out that competitors were able to prove that HP did use time bombs, and gave HP's board and management an ultimatum to stay out of the PC / home printer / tablet space or face class-action lawsuits with deep personal consequences. I am not sure how deep you are into conspiracy theories, but this explanation works for me, and personally, I would like to see them out of the game because of that incident.

Who knows, maybe Agilent, which spun off back in the day and took HP's good measurement products with it, can retake the HP name and revive it to glory once again.

Thanks
Alon

Friday, January 7, 2011

How to stop your Microwave clock from Blinking

By: Alon Cohen

Whenever I hear about a new programming language, I cringe. Essentially, programming languages are all the same: a group of commands in an allegedly human-readable language that tells your computer or device how to respond to inputs like keyboard keys, voice, numbers, the mouse, and so on.

So why do we have so many programming languages? Well, every so often a computer science professor decides that everything done so far could be done in a simpler way; for those of you who believe that, I suggest trying to program in Scheme. Too often, a software company from the Northwest decides that the best way to sell new development tools is to create a new, sharper language. Or a company in California decides that, in order to make it harder for developers to move code from other platforms, it is better to come up with its own version of Objective-C.

Generally speaking, all those "new" languages, in my mind, just slow down the development of good code (as programmers are always new to the language) and prevent the industry from building on existing foundations. Yet it sure does help a few companies thrive.

Contrary to the above, sometimes a good idea appears. This time, I would not call it a programming language but rather a programming interface. It is not an entirely new concept; it has been used for programming Lego devices and even homegrown interactive telephony systems.

However, this new concept from a Rutgers professor makes me optimistic and also proud, as my two boys are involved: Gal Cohen, who works directly on this project as a junior at Rutgers, and Roy Cohen, a sophomore in high school, who helps the team with ongoing advice about the Arduino and XBee communications.


So why am I optimistic? Well, for one, I can see this saving a significant amount of the time I spend trying to convince the kids to program the "smart" phones to do X and Y (for instance, try activating conditional call forwarding on an iPhone). It will also save me time helping less technically oriented friends deal with devices, from answering machines (yes, some people still use those) to wireless printers.


Plus, I can clearly see how the Phone.com user interface for call-handling rules (which I use every day) could become part of that Scratch world, helping customers define, in a simpler way than is available today, the different actions that take place when a call comes in.
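To make the idea concrete, here is a minimal sketch of what such block-style call-handling rules might look like once written down as code. The rule names, fields, and actions are purely illustrative; they are not Phone.com's actual API, just an assumption about how an ordered list of condition/action blocks could behave.

```python
# Purely illustrative sketch: a tiny rule engine for incoming calls.
# None of these names correspond to a real Phone.com API.
from dataclasses import dataclass

@dataclass
class Call:
    caller: str   # caller ID
    hour: int     # local hour, 0-23

# Each rule is a (condition, action) pair, evaluated in order,
# much like stacking blocks in a Scratch-style editor.
rules = [
    (lambda c: c.caller == "+15551234567", "send_to_voicemail"),
    (lambda c: 9 <= c.hour < 18,           "ring_office_phone"),
    (lambda c: True,                       "forward_to_mobile"),  # default
]

def handle_call(call: Call) -> str:
    """Return the first action whose condition matches the call."""
    for condition, action in rules:
        if condition(call):
            return action
    return "ring_default"

print(handle_call(Call(caller="+15559876543", hour=21)))  # forward_to_mobile
```

The appeal of a Scratch-like interface is exactly this: the same ordered list of condition and action blocks could be assembled visually by any customer, instead of being typed out by a programmer.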