
Posts Tagged ‘Technology’

IBM Researcher Explains What Makes Watson Tick [VIDEO]

18 Feb


Humanity took a beating from the machines this week. The world’s best Jeopardy player is no longer human.

This week, IBM’s Watson supercomputer defeated Jeopardy‘s greatest champions, and it wasn’t even close. When all was said and done, Watson won $77,147, far more than Brad Rutter ($21,600) or 74-time champion Ken Jennings ($24,000). Its ability to dissect complex human language and return correct responses in a matter of seconds was simply too much for humanity’s best players.

A few years ago, though, Watson couldn’t correctly answer even 20% of the questions it was given. And it took hours, not seconds, for Watson to process a question.

At an intimate event in San Francisco, John Prager, one of the researchers developing Watson’s ability to answer complex questions, gave a presentation detailing the work he and his colleagues did to turn Watson into a Jeopardy champion. During his presentation and a Q&A afterwards, Prager and fellow researcher Burn Lewis revealed some key nuggets of information, such as why Watson made those odd, uneven bets during Daily Doubles (an IBM researcher thought it would be boring if Watson’s bets ended with zeros, so he added random dollar amounts for kicks) or which programming languages the researchers used to build Watson (Java and C++).

So what’s next for Watson? Prager says that the next frontier is health care; he hopes that Watson’s technology can help diagnose ailments by analyzing vast quantities of data against patient symptoms and queries.

Check out the video for a deeper dive into the technology behind Watson. Watch it in HD if you want to be able to read the slides.


 

All the world’s computers equal one human mind

16 Feb

The time may come when a computer will be able to out-compute a human, but not yet. According to a recent study, adding up all the computational power in every laptop, server, mainframe, cell phone, and digital processor of every kind, everywhere on the planet, gives you the ability to handle approximately 6.4 x 10^18 operations a second. That’s about the same as a human brain.

All the world’s storage – paper, film, hard drives, etc. – would give you the same amount of storage as human DNA. In other words, somewhere around 2011 the planet has enough computing power to account for one extra person. The vast majority of “thinking” is still done by organic chemistry. Read the article I read on Ars Technica.

 
 

Why you should never, ever use two spaces after a period.

13 Jan
Last month, Gawker published a series of messages that WikiLeaks founder Julian Assange had once written to a 19-year-old girl he'd become infatuated with. Gawker called the e-mails "creepy," "lovesick," and "stalkery"; I'd add overwrought, self-important, and dorky. ("Our intimacy seems like the memory of a strange dream to me," went a typical line.) Still, given all we've heard about Assange's puffed-up personality, the substance of his e-mail was pretty unsurprising. What really surprised me was his typography.

[more ...]

 
 

Just Make It Faster

19 Dec

As a user, how often have you thought, “I wish this web service was faster”?  As a CEO, how often have you said, “Just make it faster”?  Or, more simply, “Why is this damn thing so slow?”

This is not a new question.  I’ve been thinking about it since I first started writing code (APL) when I was 12 (ahem – 33 years ago) on a computer in the basement of a Frito-Lay data center in Dallas.

This morning, as part of my daily information routine, I came across a brilliant article by Carlos Bueno, an engineer at Facebook, titled “The Full Stack, Part 1.”  In it, he starts by defining a “full-stack programmer”:

“A “full-stack programmer” is a generalist, someone who can create a non-trivial application by themselves. People who develop broad skills also tend to develop a good mental model of how different layers of a system behave. This turns out to be especially valuable for performance & optimization work.”

He then dissects a simple SQL query (DELETE FROM some_table WHERE id = 1234;) and gives several quick reasons why performance could vary widely when this query is executed.
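
To make that concrete, here is a minimal sketch of the kind of difference Bueno is pointing at. The table, column, and index names below are invented for illustration (nothing here beyond the DELETE itself comes from his article), and the exact EXPLAIN syntax and output vary by database:

-- A hypothetical table; the names are assumptions for this example only.
CREATE TABLE some_table (
    id      INTEGER,
    payload TEXT
);

-- With no index on id, this one-liner forces a scan of every row in the table:
DELETE FROM some_table WHERE id = 1234;

-- Add an index and the same statement becomes a direct lookup instead of a scan:
CREATE INDEX idx_some_table_id ON some_table (id);

-- Most databases will show the difference in the query plan
-- (a full table scan before the index exists, an index lookup after):
EXPLAIN DELETE FROM some_table WHERE id = 1234;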

It reminded me of a client situation from my first company, Feld Technologies.  We were working on a logistics project with a management consulting firm for one of the largest retail companies in the world.  The folks from the management consulting firm did all the design and analysis; we wrote the code to work with the massive databases that supported this.  This was in the early 1990s and we were working with Oracle on the PC (not a pretty thing, but required by this project for some reason).  The database was coming from a mainframe and, by PC standards, was enormous (although it would probably be considered tiny today).

At this point Feld Technologies was about ten people and, while I still wrote some code, I wasn’t doing anything on this particular project other than helping at the management consulting level (e.g. I’d dress up in a suit and go with the management consultants to the client and participate in meetings.)  One of our software engineers wrote all the code.  He did a nice job of synthesizing the requirements, wrestling Oracle for the PC to the ground (on a Novell network), and getting all the PL/SQL stuff working.

We had one big problem.  It took 24 hours to run a single analysis.  Now, there was no real-time requirement for this project – we might have gotten away with it if it took eight hours, since we could just run the analyses overnight.  But it didn’t work for the management consultants or the client to hear “OK – we just pressed go – call us at this time tomorrow and we’ll tell you what happened.”  This was especially painful once we gave the system to the end client, whose internal analyst would run the system, wait 24 hours, tell us the analysis didn’t look right, and bitch loudly to his boss, who was a senior VP at the retailer and paid our bills.

I recall having a very stressful month.  After a week of this (where we probably got two analyses done because of the time it took to iterate on the changes the client requested for the app), I decided to spend some time with the engineer who was working on it.  I didn’t know anything about Oracle, as I’d never done anything with it as a developer, but I understood relational databases extremely well from my previous work with Btrieve and Dataflex.  And, looking back, I met the definition of a full-stack programmer all the way down to the hardware level (at the time I was the guy in our company who fixed the file servers when they crashed with our friendly neighborhood parity errors or “NetWare device driver failed to load” errors).

Over the course of a few days, we managed to cut the run time down to under ten minutes.  My partner Dave Jilk, also a full-stack programmer (and a much better one than me), helped immensely as he completely grokked relational database theory.  When all was said and done, a faster hard drive, more memory, a few indexes that were missing, restructuring of several of the SELECT statements buried deep in the application, and a minor restructure of the database was all that was required to boost the performance by 100x.
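
The original schema and queries are long gone, so what follows is only a sketch of the kind of change that produces that sort of speedup; the table names, columns, and statements are invented for illustration and are not from the actual project:

-- Hypothetical "before": a correlated subquery that runs once per row of routes.
SELECT r.route_id,
       (SELECT SUM(s.weight)
          FROM shipments s
         WHERE s.route_id = r.route_id) AS total_weight
  FROM routes r;

-- Hypothetical "after": a single aggregated join, which the optimizer handles far better.
SELECT r.route_id, SUM(s.weight) AS total_weight
  FROM routes r
  LEFT JOIN shipments s ON s.route_id = r.route_id
 GROUP BY r.route_id;

-- Plus the index that was missing on the join/filter column:
CREATE INDEX idx_shipments_route_id ON shipments (route_id);

None of these changes is exotic on its own; the point of the story is that finding all of them meant looking at the hardware, the database configuration, and the SQL together.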

When I reflect on all of this, I realize how important it is to have a few full-stack programmers on the team.  Sometimes it’s the CTO, sometimes it’s the VP of Engineering, sometimes it’s just someone in the guts of the engineering organization.  When I think of the companies I’ve worked with recently that are dealing with massive scale and have to be obsessed with performance, such as Zynga, Gist, Cloud Engines, and SendGrid, I can identify the person early in the life of the company who played the key role.  And, when I think of companies that did magic stuff at massive scale, like Postini and FeedBurner, I know exactly who that full-stack programmer was.

If you are a CEO of a startup, do you know who the full-stack programmer on your team is?

 
 

Tim Wu on the new monopolists: a “last chapter” for The Master Switch

13 Nov
I reviewed Tim Wu's great history of media consolidation and regulatory capture The Master Switch earlier this month; now Tim says, "This piece I wrote for the Wall Street Journal is an important one. It is like a last chapter for my book."
We wouldn't fret over monopoly so much if it came with a term limit. If Facebook's rule over social networking were somehow restricted to, say, 10 years--or better, ended the moment the firm lost its technical superiority--the very idea of monopoly might seem almost wholesome. The problem is that dominant firms are like congressional incumbents and African dictators: They rarely give up even when they are clearly past their prime. Facing decline, they do everything possible to stay in power. And that's when the rest of us suffer.

AT&T's near-absolute dominion over the telephone lasted from about 1914 until the 1984 breakup, all the while delaying the advent of lower prices and innovative technologies that new entrants would eventually bring. The Hollywood studios took effective control of American film in the 1930s, and even now, weakened versions of them remain in charge. Information monopolies can have very long half-lives.

Declining information monopolists often find a lifeline of last resort in the form of Uncle Sam. The government has conferred its blessing on monopolies in information industries with unusual frequency. Sometimes this protection has yielded reciprocal benefits, with the owner of an information network offering the state something valuable in return, like warrantless wiretaps.

In the Grip of the New Monopolists

 

Feds admit to storing tens of thousands of images from naked scanners – unknown number leaked back to manufacturer

09 Nov
You know those naked scanners we’re seeing at the airport that use backscatter radiation to show snoopy security staff high-resolution, detailed images of your genitals, breasts, etc.? The ones that aren’t supposed to be storing those images from your personal involuntary porn shoot?

Well, the US Marshals have just copped to storing over 35,000 of these personal, private images taken from a single courthouse scanner in Florida.

What's more, another machine used in a DC courthouse was returned to the manufacturer with an unspecified number of naked images on its hard drive.

A 70-page document (PDF) showing the TSA's procurement specifications, classified as "sensitive security information," says that in some modes the scanner must "allow exporting of image data in real time" and provide a mechanism for "high-speed transfer of image data" over the network. (It also says that image filters will "protect the identity, modesty, and privacy of the passenger.")

“TSA is not being straightforward with the public about the capabilities of these devices,” said Marc Rotenberg, executive director of the Electronic Privacy Information Center (EPIC). “This is the Department of Homeland Security subjecting every U.S. traveler to an intrusive search that can be recorded without any suspicion--I think it’s outrageous.” EPIC’s lawsuit says that the TSA should have announced formal regulations, and argues that the body scanners violate the Fourth Amendment, which prohibits “unreasonable” searches.

Feds admit storing checkpoint body scan images (Thanks, Master Pokes!)

 
 

Chrome Lets You Remove Your Flash and Have It, Too

04 Nov
John Gruber at Daring Fireball has a clever workaround for when you want to have Flash available on demand on a Mac, but don't want it installed by default in all your browsers. John formerly used ClickToFlash with Safari to let him selectively control which Flash content displayed; there's a similar add-on called Flashblock for Firefox.

Instead, John removed Flash from the various plug-in directories shared by browsers. He notes that Web sites now feed him alternative content, like static ads, since his browser no longer pretends it can accept Flash only to ignore it. A YouTube extension forces HTML5-compatible video to load, too.

When he needs Flash, John launches Google Chrome, which has integral Flash support (it can be disabled, but you can't whitelist or blacklist specific sites). When he's done, he quits Chrome to prevent Flash from chewing cycles in the background.

 
 

Nobel Worthy: Best Graphene Close-Ups

05 Oct

Sorry, diamond lovers, but graphene is the most awesome form of carbon out there. Evidence: Andre Geim and Konstantin Novoselov, the two scientists who isolated one-atom-thick sheets of the stuff in 2004, won the Nobel Prize in Physics this morning -- netting themselves a pot of 10 million Swedish kronor (about $1.49 million).

Despite its razor-thin makeup, graphene is one of the strongest, lightest and most conductive materials known to humankind. It’s also 97.3 percent transparent, but looks really cool under powerful microscopes. We’ve corralled some of the best shots here, with a bonus video of graphene being punished by an electron beam.

Mmmm... Graphene Cake

Theoretical physicist Philip Russell Wallace predicted graphene’s existence in 1947, but it wasn’t until the 1960s that scientists began looking for it in earnest. Forty years later, researchers had practically written off isolating single-layer graphene. If the hexagonal layers didn’t roll up into buckyballs or nanotubes, so the thinking went, they’d disintegrate entirely.

Geim and Novoselov persisted, however, and figured out how to isolate it using objects common to any office: Scotch tape and graphite, which is found in pencil leads.

At the top-right of this image is a 10-micron-wide, 30-layer-thick slice of graphene sheets.

Image: Science


 
 

Is the web really dead?

17 Aug
Wired uses this graph to illustrate Chris Anderson and Michael Wolff's claim that the world wide web is "dead."

ff_webrip_chart2.jpg

Their feature, The Web is Dead. Long Live the Internet, is live at Wired's own website. Without commenting on the article's argument, I nonetheless found this graph immediately suspect, because it doesn't account for the increase in internet traffic over the same period. The use of proportion of the total as the vertical axis, instead of the actual total, is an interesting editorial choice.

You can probably guess that total use increases so rapidly that the web is not declining at all. Perhaps you have something like this in mind:

graph2.jpg

In fact, between 1995 and 2006, the total amount of web traffic went from about 10 terabytes a month to 1,000,000 terabytes (or 1 exabyte). According to Cisco, the same source Wired used for its projections, total internet traffic then rose from about 1 exabyte to 7 exabytes between 2005 and 2010.

So with actual total traffic as the vertical axis, the graph would look more like this.

3.jpg

Clearly on its last legs!

Assuming that this crudely renormalized graph is at all accurate, it doesn't even seem to be the case that the web's ongoing growth has slowed. It's rather been joined by even more explosive growth in file-sharing and video, which is often embedded in the web in any case.

Update: It's also worth adding that bandwidth, though an interesting measure of the internet's growth, isn't so good for measuring consumption. It doesn't map to time spent, work done, money invested, wealth yielded... Does 50MB of YouTube kitteh represent more meaningful growth than a 5MB Wired feature? And, as others point out in the comments, many of the new trends are still reliant on the web to work, especially social networking.



 
 

Arthur C. Clarke predicted satellite TV and GPS in the 40s and 50s

26 Jul
4830422022_2e3dfd5384_b.jpg

Above, a letter written by Arthur C. Clarke in 1956 predicting, quite accurately, aspects of the future of communications.

Link [via Letters of Note via dvice]