Another hot new startup that has recently come out of stealth mode is SambaCloud. Founders Ian Howells and Razmik Abnous bring many years of content management experience from companies like Documentum and, in the open source space, Alfresco.
When I first heard about SambaCloud, I was a bit skeptical. But as soon as I saw the demo, it was quite clear that SambaCloud does something that no one else does. While there are lots of content management software companies and quite a few cloud storage companies, SambaCloud brings a much needed fresh perspective on some age-old problems that vex most organizations. Namely, how do you get people sharing the right information?
SambaCloud enables you to easily set up different channels of content so that you can monitor, share and collaborate with others. These channels can be built from different types of content ranging from external newsfeeds to internal documents and presentations. That may sound like a minor detail, but it's a huge breakthrough in transforming content management from something that sounds as fun as a tax audit into something that's as easy to use as Facebook or Flipboard.
Despite being a relatively young company, SambaCloud has come out of the gate with not only an easy-to-use Web application, but also mobile versions for iPhone and iPad. Initially, SambaCloud is targeting sales & marketing teams, but ultimately I think there are probably dozens if not hundreds of potential use cases.
I picked up a Kindle 3 (aka Kindle Keyboard) a while back and have been thoroughly impressed with it. I love the fact that I can take half a dozen or more books with me when I travel without taking up a lot of weight or space in my luggage. While I wish the Kindle used an open standard like ePub rather than the Mobi file format, given the large variety of books and reasonable prices, I can live with it. The only thing I don't like is that the Kindle is essentially a closed system and not very customizable. Nonetheless, there's a healthy community of open source hacks out there.
If you have a Kindle and are tired of the standard screensaver images of authors, you can install a Kindle screensaver hack with a couple of downloads. Just make sure you install the right version for your particular Kindle device.
IBM's Watson natural language question-and-answer system made headlines recently with its primetime debut on Jeopardy. Despite a few embarrassing answers, Watson trounced top Jeopardy players Brad Rutter and Ken Jennings. Watson is built from 90 IBM Power 750 Linux servers with 16 terabytes of memory providing 80 teraflops of processing power, making it perhaps the most famous "Big Data" system out there. Watson's knowledge base consists of 200 million pages of text that is pre-processed using Hadoop and occupies 4 terabytes of on-disk storage. What makes Watson unique is its ability to process questions in real time, assigning confidence levels to its answers. While Watson is not necessarily true machine intelligence, it does a good job of demonstrating how computers can complement human intelligence.
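To give a feel for what "assigning confidence levels to its answers" means, here's a toy sketch in Python. This is purely hypothetical, not IBM's actual algorithm: each candidate answer collects evidence scores from several independent scorers, the scores are combined into a single confidence, and the system only "buzzes in" when the top answer clears a threshold.

```python
# Toy illustration of confidence-scored question answering.
# All names, weights and numbers here are hypothetical; Watson's real
# DeepQA pipeline combines evidence from hundreds of scoring algorithms.

def combine_evidence(scores, weights):
    """Weighted average of per-scorer evidence for one candidate answer."""
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

def answer(candidates, weights, buzz_threshold=0.5):
    """Rank candidates by confidence; 'buzz in' only above the threshold."""
    ranked = sorted(
        ((combine_evidence(scores, weights), cand)
         for cand, scores in candidates.items()),
        reverse=True)
    confidence, best = ranked[0]
    if confidence >= buzz_threshold:
        return best, confidence
    return None, confidence  # not confident enough to buzz in

# Hypothetical evidence scores from three scorers for one clue.
candidates = {
    "Toronto": [0.2, 0.3, 0.1],
    "Chicago": [0.8, 0.7, 0.9],
}
weights = [1.0, 0.5, 1.5]
print(answer(candidates, weights))
```

The buzz threshold is what separates this from simply picking the top-ranked answer: a confidently wrong buzz costs money on Jeopardy, so staying silent on low confidence is part of the strategy.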
Stephen Baker, a former writer at BusinessWeek, was on hand observing the team and has written Final Jeopardy to chronicle IBM's efforts. The book was published, minus the final chapter, before Watson's appearance on Jeopardy and then finished in the days that followed. I did a short Q&A interview with author Stephen Baker to get his take on the project.
I had finished The Numerati and was back at BusinessWeek, looking for the next book project. BusinessWeek was in the process of dying (Bloomberg bought the remains), and I had requested to be let go in the transition. A few weeks before that happened, I was having lunch at IBM and heard about the Watson project. It seemed like a dream for me: a story to tell with a championship match at the end, and really interesting computer issues to boot. My only fear was that some other writer must already be writing the book. I was enormously relieved when I found this wasn't true.
Q. What was the biggest surprise in covering this story?
That computer scientists can be so utterly captivated by language. You know, a lot of us divide the world into halves, the number people and the word people. I sometimes fall into this. But when you spend time with a team that's trying to train a machine to make sense of English, you see computer scientists dissecting the language with a precision that probably surpasses the most dedicated (and neurotic) English PhDs at Ivy League schools. And what's especially interesting is that they cannot afford to focus on theory. They have to study the statistics of how we actually communicate--and then use them to program the computer.
Q. Given IBM's involvement were you able to tell the story you wanted to tell, or was there an approval process on what you wrote?
I could write whatever I wanted. They didn't demand any pre-approval. (Jeopardy actually appeared more concerned about these issues, and I had to send them a long list of "facts" about Jeopardy that appeared in the book.) No such issues with IBM. I think they had confidence in the machine, and even if it lost in the final match, which was always a concern, readers would see a team at IBM taking computing to a new level.
Q. Cynics might argue that Watson's ability to deal with Jeopardy questions is really little more than a parlor trick, akin to old school interactive fiction games like Infocom's Zork or Hitchhiker's Guide to the Galaxy and not a true measure of intelligence or perhaps not even useful. What's your perspective on this?
Look, the Jeopardy game was a contrivance, and IBM chose the game in part because it came with a national audience with millions of viewers. That much is given. What's more, you could argue that it is not a test of intelligence, since the machine doesn't really understand the answers it's bringing back, and cannot draw conclusions from them. But I'd say to look at the results. IBM's computers struggled four years ago to answer much more ordinary natural-language questions in annual competitions. Their machine got about one of three right, and they were among the top performers. Building a Jeopardy machine was a huge advance in the domain of question-answering. As far as the usefulness, having machines "read" millions of documents and bring back answers, each one scored for its level of confidence, will be extremely useful--and even disruptive--in a number of industries. Whether or not the technology comes from IBM or its competitors, or whether the Watson platform itself will play a role, is still open to question.
Q. As I read "Final Jeopardy" I'm reminded occasionally of Tracy Kidder's 1981 book "The Soul of a New Machine." Of course that's quite a long time ago now and the computer they built has less power than my iPhone 4. But Data General was an underdog at that time and the Eagle project was essential for the company to survive. In the case of "Final Jeopardy" how do you create drama around a company as established and successful as IBM?
Most of the drama centered around whether the IBM team could take a dumb machine that played Jeopardy at the level of a fifth grader and turn it into a champion--and then whether or not it could actually win. So while Soul of a New Machine was a corporate drama, this was a little closer to sports. There was also some drama in the conflicts between IBM and Jeopardy, which was basically a tug of war between science and Hollywood.
Q. What was your schedule like to finish the book?
I got the contract for the book on January 26, 2010. I agreed to deliver the first third to my editor by the end of June, the second third by the end of September, and the rest--minus the final chapter--by Nov. 7. During much of the time I was rewriting the beginning and reporting the end at the same time. All of those chapters went into production in early December. I reported on the final match on Jan. 14 and wrote the final chapter over the following weekend. The partial ebook came out on Jan 26, 2011, one year to the day after receiving the contract. So it was a quick turnaround. I would expect that schedules like that will become the norm. The book industry has to speed up.
Q. How did you feel about releasing the eBook version on Amazon before the final chapter was finished?
I was happy to release the ebook early. It was new, and it wasn't seamless. Some of the people who bought the partial ebook didn't get the final chapter until a few days after the match. But I would imagine that those who read through Chapter 10 before the match might have enjoyed the show more, since they knew the cast of characters--including the computer. What's more, I think more books are going to be published this way, in dribs and drabs. So it was nice to be the first.
Zack Urlocker is Chief Operating Officer at Zendesk, a cloud-based help desk provider. He was previously the Executive Vice President of Products at MySQL where he was responsible for Engineering and Marketing and helped grow the company to $100 million in revenue. Urlocker is an investor, advisor and board member to several software companies.
Last fall, before I joined Zendesk, I took a role as an Executive-in-Residence at Scale Venture Partners. A lot of people asked me about this, so I've written an article at GigaOm that describes my thought process and what I ended up working on.
While there are as many variations on the EIR position as there are venture firms, there are two flavors, generally speaking: Entrepreneur-in-Residence and Executive-in-Residence. Most firms have some experience with Entrepreneur-in-Residence programs. Essentially, they give office space, coffee and food to a proven entrepreneur so he or she can spend a few months researching or prototyping a new product or service...
I’m not the inventor type, and I didn’t want to just “hang out” at a venture firm to look at their portfolio companies, so I proposed doing a specific research assignment with Scale Venture Partners. Scale typically invests in late-stage Series B or Series C companies, which made it a good fit. I am good at scaling companies, but I don’t know that I’m any better at predicting which Series A investments will thrive than anyone else.
The key area of my research was to analyze some specific developments around Software-as-a-Service (SaaS), big data and NoSQL. This entailed studying the technologies in this area, understanding the optimal use cases and speaking to a broad range of customers and prospects to see what was actually happening in the marketplace and what patterns were starting to emerge. I also worked with the senior partners at Scale to determine how to keep them up-to-date with my findings and how the results would fit into their overall software investment strategy. The important takeaway was that a venture firm is not a research house. At the end of the day, they would measure the success of this effort by whether it helped them make better investments.
While I could have done my research on my own, it was helpful to have an office to go to, access to research materials and introductions from the Venture partners to companies that were using some of the new technologies. While I was primarily focused on pursuing my research, I also gave the investors my perspective on a couple of portfolio companies.
Working as an EIR was one of the most interesting projects I've done. While it was a short, concentrated time, it exposed me to some new ways of thinking about managing the growth of a software company and how to think like an investor. The contacts I made were also invaluable.
You can read the full article over at GigaOm or a slightly expanded version at Scale Venture Partners' site. The full version also includes some tips for those who are thinking about an EIR role and how to determine the best match with your own needs.
I only recently found out about GigaOm's upcoming Net:Work conference. It's held December 9 at UCSF Mission Bay conference center. While the name of the conference is a bit ambiguous, the actual area of focus is very clear: how will we collaborate in the 21st century?
The impact of smartphones, tablet computing, social networks, Software-as-a-Service and Cloud computing is just starting. As a result, I think there are tremendous opportunities for startup companies to disrupt existing markets with more modern, lightweight applications that foster collaboration inside the company as well as with partners, vendors, consultants and customers.
Companies that can more effectively tap into talent within their organization and across traditional boundaries may end up having a significant competitive advantage. Instead of the traditional top-down view of management edicts flowing from HQ to employees and field offices, you now have the potential to develop, test and refine ideas from any part of the company or community regardless of location.
That was the approach we took at MySQL and it worked very well with employees distributed in more than 40 countries, 90% of whom worked from their homes. We also had a huge community of users we could tap into that contributed tremendous value to the company. Even though we had primitive tools for collaboration (IRC, Skype, Forums, Wikis, conference calls, mailing lists etc), we always operated with a global perspective. This enabled us to develop great talent regardless of location. Managing a distributed organization is not easy, but you get some amazing benefits if you do it right.
Speakers at the conference include Marc Benioff (Salesforce.com), Dave Hersh (Jive), Maynard Webb (LiveOps), Tom Kelly (Moxie Software), Doug Solomon (IDEO), Zach Nelson (NetSuite), Aaron Levie (Box.net), Ross Mayfield (SocialText) and more.
I wrote a guest column for GigaOm on how open source software, cloud and software as a service are helping to bring about the consumerization of IT: namely bringing simplicity where complexity reigned. I cited some examples including New Relic, Box.net and Apple.
Open source has gone a long way toward putting power back in the hands of developers, who can download, install and deploy software without having to go through any kind of convoluted sales or budget approval process. You want MySQL? You can download and install in 15 minutes, and you don’t have to talk to anyone to do it.
Software as a service (SaaS) takes this to an even broader audience, enabling employees to get the kind of lightweight, consumer, self-serve capabilities in their job without even having to run their own servers. Platforms like Amazon AWS, Heroku, Makara, RightScale and others put this same kind of SaaS power in the hands of developers...
My view: ease of use trumps a long feature list any day of the week. There are technological as well as sociological and economic reasons why organizations are seeking greater simplicity. Part of this stems from the fact that complex enterprise applications grew beyond the ability of most organizations to successfully adopt.
It seems obvious that given the decreasing cost of storage and computation, there's going to be a significant increase in the volume of data that organizations accumulate over the next 10 years. But the type of data being accumulated may be different from the areas where traditional DBMSs dominated. It's not just about transactions; it's search patterns, on-line behavior, click-thru data, events fired off by smartphones, messages over Twitter & Facebook, log data of various kinds.
If an organization can figure out a better way to identify prospects, or deliver more targeted ads, or optimize pricing decisions by analyzing terabytes of data, they'd be crazy not to. Over the long term, companies that don't develop these capabilities will be at a competitive disadvantage.
As to what the implications are from a technological perspective, that's a whole different can of worms. I'm starting to see adoption of Big Data technologies like Hadoop, HDFS, Cassandra, MongoDB, XML databases, analysis with R, Pentaho, and loads of other technologies. And MySQL continues to play a role here as do other traditional relational databases. Over the next few months, I'm going to dig down deeper with people using these technologies to try and discern the emerging customer patterns.
If you're in this space or using some of these technologies, let me know your thoughts. What volume of data are you dealing with? How many nodes or servers are you using? Are you running on a public cloud, private cloud or hybrid? What technologies did you evaluate? What about traditional DBMSs didn't work for this scenario?
I'm on the boards of two companies (Pentaho, Revolution Analytics) that are starting to see a lot of customer traction around Big Data. More and more companies in media, pharma, retail and finance are doing advanced analysis, reporting, graphing, etc. with massive data sets. It made me wonder what other areas of the technology stack might evolve with the trend towards Big Data. Obviously, there are new middleware layers like Hadoop and MapReduce, and we're also seeing the emergence of NoSQL data management layers with Cassandra, MongoDB, Membase and others. But what about programming languages?
So why don't I have this language yet? Well, partially because programming language craftsmanship is hard. I'm pretty sure I'm not good enough to do it, which is usually my default criteria for saying something is Really Hard.
But I think as well the k3wl languages coming out are coming out of language requirements of the Top 10% crowd. They're the ones good enough to actually write the languages, and they're going to write a language that makes them happy. But then you end up with Scala, and then you end up with this monstrosity, and then you make me cry. A language in which that thing is even possible will never be a candidate as a Journeyman Programming Language.
You know who's going to do it? Someone like Gosling, who set about with the needs of the journeyman programmer in Java. But the state of the art has moved on, and Java just isn't suitable anymore.
Who I would really like to do it is Anders Hejlsberg. I am a very big fan of C#-the-Language. It's just that .Net-the-Ecosystem is so Microsoft-specific and horrific it'll never catch on in the wider world, no matter what Miguel de Icaza thinks.
This got me thinking about the challenge of the current complexity in Big Data systems. Today, you have to be near genius level to build systems on top of Cassandra, Hadoop and the like. These are powerful tools, but very low-level, equivalent to programming client-server applications in assembly language. When it works it's great, but the effort is significant and it's probably beyond the scope of mainstream IT organizations. (That's one reason that Revolution's R product has appeal, but R is a specialized statistical analysis tool, not a general purpose language.)
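To make the assembly-language analogy concrete, here's a hedged sketch in Python (the data and function names are invented for illustration): the same aggregation, counting events per user, written first as the explicit map/shuffle/reduce phases that Hadoop-style frameworks make you think in, then as the one-liner a higher-level language or library could reduce it to.

```python
# Counting events per user, two ways. Data and function names are
# invented; this only illustrates the level-of-abstraction gap.
from collections import Counter, defaultdict

events = [("alice", "click"), ("bob", "click"), ("alice", "view")]

# 1. Hadoop-style: explicit map, shuffle and reduce phases.
def map_phase(records):
    for user, _event in records:
        yield (user, 1)                # emit a key/value pair per record

def shuffle_phase(pairs):
    groups = defaultdict(list)
    for key, value in pairs:           # group values by key
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

low_level = reduce_phase(shuffle_phase(map_phase(events)))

# 2. What a higher-level abstraction could make a one-liner.
high_level = Counter(user for user, _ in events)

assert low_level == dict(high_level)   # same answer, very different effort
```

Of course, the framework plumbing exists because it parallelizes across hundreds of nodes; the question is whether a new general purpose language could keep that scalability while hiding the phases.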
Could the Big Data complexity be factored out somehow with a new general purpose programming language? No doubt. Having worked with Anders on the creation of Delphi many years back, this is right up his alley. Or maybe we already have a good starting point with Erlang, Scala and Google's Go. Go is particularly interesting having been designed by Rob Pike and Ken Thompson of Bell Labs / Unix fame.
What's been your experience in programming Big Data systems? What do you think's needed? Let me know in the comments below.
Zack Urlocker is an investor, advisor and board member to several startup software companies in SaaS and Open Source. He was previously the EVP of Products at MySQL responsible for Engineering and Marketing. He built the MySQL Enterprise subscription strategy and product line. MySQL was sold to Sun for $1 billion and is now part of Oracle Corporation. He is also a marathon runner, blues guitarist and fan of Interactive Fiction.
I managed to get an "early upgrade" of my iPhone 3GS to the iPhone 4 despite AT&T's best efforts. I've had a couple of days using the new iOS 4 operating system on my 3GS and a couple of hours with the iPhone 4. So here are a few highlights of the initial hands-on experience with more updates on the weekend.
Updated with additional information, video & photos.
Low-res video (VGA resolution):
Here's a gallery of additional photos from the iPhone 4: (Double click to see larger versions.)
Other than AT&T's longstanding inability to deal with demand, the upgrade process is pretty simple. I chased the UPS driver home to get my iPhone today and just plugged it into the USB cable to restore my last iPhone 3GS backup. That took about 20 minutes. Applications, data, settings etc were exactly where I left them. However, since I manually manage my MP3 files, it didn't restore those, which is kind of a nuisance. You also need to re-enter email and WiFi passwords, which makes sense. During the upgrade process you can sign up for a 60 day trial of MobileMe, which is tempting if you own an iPad and iPhone. And you also need to re-activate the account by calling a toll-free, somewhat-automated AT&T service. My hold time was just over 3 minutes and then it took another few minutes to go through the terms and conditions. In fact, I had to type or say my cell number 3 times, my zip code twice and agree to the terms twice. All told it took about 10 minutes. But considering it's AT&T, it could have been worse.
But then a few minutes after syncing, I noticed that not all of my applications were restored. The New York Times, Frotz, Wikipanion, Engadget, Guitar Tab Toolkit and several other apps were missing. Not quite sure why. So I plugged in the USB cable a second time, canceled the backup and suddenly the remaining apps were being restored. That took another 20 minutes. Not sure if I did something wrong here or the iTunes Store was overloaded. But if some of your apps aren't restored initially, don't panic. If this happens you'll also have to re-arrange the app icons back to how you used to have them.
Better battery life, better screen, better audio, better camera and, for those who actually need to talk on their phone, better cellular coverage. Admittedly, it's still AT&T, but I believe the new antenna built into the casing will help. On my 3GS, I've had calls drop 4 or 5 times while driving on 280 (which has a black hole for cell service near Sand Hill Road.) But so far, so good.
The iPhone 4 is slightly skinnier than its predecessor, and a bit more squared off, but to me the differences are subtle. It's the same weight and doesn't really feel much thinner, not that that was an issue. If you had a third-party case for your old iPhone it may or may not fit the new one, depending on how snug it was to begin with. My old soft rubberized case seems to hang a bit loose, like pants a size too large, but it's not far off. If you're into design then yes, the iPhone 4 has got a modern-retro cool style. But to me it's not a big deal.
The new screen is better, but again, it's a fairly subtle improvement. However web sites with small fonts, like the mobile version of TechCrunch, are definitely more readable. And even existing built-in apps benefit from the higher res fonts. In side by side comparison, the new screen is sharper and seems to have better contrast, making it easier to read. For news applications, it's almost like reading a printed magazine, albeit a very small, fussy one.
Similarly, performance is a little faster for some apps. For example, Google maps screen refresh is noticeably snappier than before. And in side by side comparisons, for example, updating stories from the New York Times or AllThingsD, the iPhone 4 is consistently faster. Not a lot faster, and not in itself enough to make a huge fuss about, but I'll take it.
The camera, on the other hand, is noticeably improved. I often end up at conference panels or blues clubs where I don't always have my trusty Canon G9. In these cases, the lighting conditions are never ideal and as a result, the iPhone 3GS camera just doesn't cut it. And in my experience, the 3GS video was completely useless as any volume of live music (say 100 db, which is loud but doesn't require earplugs) gets clipped and distorted by the built-in microphone.
The iPhone 4's camera is much improved. The pictures are higher-res (5 megapixels versus 3) but the real improvement comes from better sensitivity in low-light conditions. The iPhone 4 also has a front VGA (640x480) low-res camera used by the FaceTime video conference call app that is also suitable for quick self-portraits with less fumbling around. While the camera isn't perfect, it's miles better than the 3GS and can match low-end point-and-shoot or Flip video cameras. I've posted two photos taken in crappy lighting and the iPhone 4 does a decent job of it.
Sample high-res shot:
Sample low-res VGA shot:
The video is ok in low-light but the result is a very grainy image, almost on par with the latest Flip video camera, but not really comparable to a high-end point and shoot camera such as the Canon G9. But it is definitely much better than the iPhone 3GS and more convenient than carrying an iPhone and a Flip. Here's a quick and dirty use of the low-res VGA video capability:
(My apologies for the guitar playing!)
And some live concert footage at about 100db:
The audio is definitely better than the 3GS, but the picture is very grainy.
Improved Cellular Reception
While there have been some questions and comments about reduced cell reception depending on how you hold the iPhone, I haven't had any problems. (Hint: avoid directly touching the antenna in the lower left corner when you hold the phone.) Still, I put a piece of tape over the lower left corner antenna just to be on the safe side.
In fact, as the video at the top of the post demonstrates, I was able to make continuous calls on several notorious silicon valley dead spots, including Highway 280 near Sand Hill Rd and Highway 17 to Santa Cruz. However, I did lose reception in a tunnel (to be expected), on the Bay Bridge and on rural Highway 9 in Saratoga. But this was still fewer dropped calls than usual.
If you've been frustrated by dropped calls with the 3GS, this improvement alone may be worth the upgrade price.
The speaker is also slightly clearer which is useful if you do conference calls or play music from the speaker (which I do on occasion.)
The new iOS 4 is good on the 3GS but it really rocks on the new hardware. Not only is the multi-tasking quick, but the better hardware makes even existing applications look better and run faster. Hopefully in the weeks to follow we'll see more applications updated to use the new multi-tasking.
Note that the multi-tasking on the iPhone is not the same kind of flat-out full-on multi-tasking you may be used to on a desktop computer. It's really more of an intelligent quick-restore of an application with some limited multi-tasking for maintaining a cell connection, playing music, getting notifications etc. On a handset, this seems to work fine. It's not like I need a massive spreadsheet to recalc or long-running DBMS transactions to run in separate threads. But we'll see in the fall whether this same model works as well on the iPad.
Nonetheless, the multi-tasking is as intuitive as you could imagine. Double-click the iPhone button below the screen to pull up your recent or running applications. That way you don't have to go back through the home screen and scroll through pages of apps when you, say, confirm a calendar appointment in an email while talking on the phone and looking at a map. It's not as good as having multiple desktop apps on the screen at once, but the experience works well on the small screen of a smartphone.
Some of the built-in apps are also improved. For example, iPhone email now has a unified in-box and threaded conversations. With a unified in-box, I can now finally start to move off hotmail and over to gmail without having to manually check email in two places.
You can also now run the iBooks application on the iPhone, with bookmarks and content synchronized. If you're happy with your 3GS and just want multi-tasking, a unified email inbox with threaded messages and iBooks, you can get all of that with a free upgrade to iOS 4.
Apple has also introduced FaceTime, an iPhone 4-to-iPhone 4 video conference calling capability. Unfortunately, it runs only over WiFi. Still, it could be a useful application for those who travel a lot.
Overall, iPhone 4 is an incremental improvement. I am not sure whether I would label it game changing; that depends on how much you use FaceTime, iMovie or other new applications that have yet to be created. But it is certainly a worthwhile upgrade, just to get the improved battery life, camera and cellular reception. But if you do the upgrade, note that it can take about an hour to backup your old phone, restore on the new one and activate the account with AT&T. Don't attempt this if you need to use the new phone in 10 minutes.
If you can get the subsidized price or early upgrade and can live with AT&T, then it's $200 well spent. Otherwise, you may have to wait for Verizon to pick it up next year.
With the iPhone 4, Apple has once again set and raised the bar.
Oracle managed to score a major victory last week at the MySQL Conference by announcing performance gains of 200-360% in the forthcoming version 5.5. This is a tremendous improvement and comes in part from closer collaboration between what were historically two distinct (and occasionally competitive) groups: the InnoBase team and the MySQL Server team. Bringing the InnoBase team under the direction of Tomas Ulin's MySQL Server team is a great benefit not only to MySQL developers, but also to MySQL users. No doubt these performance gains are the result of many months of hard work by not only Tomas, but also a good number of folks on both teams, including Mikael Ronström, Kostja, Calvin Sun and others.
It seems that in the MySQL 5.5.4 release, several performance bottlenecks that really affected scalability beyond 4 cores have been either eliminated or seriously mitigated. Some of the changes were in MySQL itself, while others are InnoDB specific...
The benchmarks presented that compared MySQL 5.5.4 with 5.1 show substantial improvements in a variety of workloads. And given how many shops are still running MySQL 5.0.xx in production (including us), that means there really is A LOT to look forward to--especially on newer hardware.
I, for one, cannot wait to see what this stuff does for us. Thanks to the MySQL and InnoDB teams for their continued hard work and dedication to making MySQL faster as hardware evolves.
For those who have been skeptical, these results should go a long way towards demonstrating Oracle's commitment to ongoing investment and improvement of MySQL. Who knows, maybe this will help eliminate some of the rhetoric and FUD from the splinter groups in the MySQL community. And of course, Oracle will need to continue to ramp up investment in other areas of MySQL to make good on their promises. But they're off to a better start than anyone could have expected.
I've included some video excerpts from keynote presentations by Oracle VP Edward Screven and from Open Source maven Tim O'Reilly below.