.EXE Interview with Anders Hejlsberg on Delphi (1995)

To commemorate the 25th anniversary of Delphi on Feb 14, 2020, here is a transcript of an interview with Anders Hejlsberg, Chief Architect of Delphi, conducted in 1995 by .EXE Magazine editor Will Watts. Anders discusses the design and development of Delphi and the then-forthcoming 32-bit version for Windows 95. This was the most detailed technical interview published about Delphi at the time.

Q. How did the idea for Delphi evolve from Turbo/Borland Pascal? At what stage did you decide to add the environment, database support etc?

A. The key idea was to design a tool that combines a visual development environment, Client/Server database support, and a native code compiler. Before Delphi, you always had to make a choice. Do I go for the performance of a native code compiler, or the ease of use of a visual development environment? Do I go for a powerful object-oriented language, or a proprietary 4GL Client/Server tool? What programmers really want is all of the above, in one package. That's what we set out to do with Delphi.

What it really boils down to is productivity--we wanted to design a tool that would make developers more productive, all the way from prototype to production code. Other products lure you with visual tools, but once you get halfway through your project, they let you down because of sluggish performance, lack of extensibility, or general stability problems. The competition talks about adding extensibility and improving performance. That's a fundamental difference between their products and ours. Extensibility and performance were on the whiteboard the first day we started designing Delphi, and they permeate the entire product. For example, if you want to design a new component in Visual Basic, you have to write it in another language, such as C or C++ (or Delphi, for that matter). None of your VB skills can be reused, you have to learn a different language, and you can't easily inherit from any of the built-in components. Delphi, on the other hand, allows you to write new components in Delphi, and you can inherit from any of the built-in ones. That's true extensibility, and it translates into a substantial productivity boost.
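
For illustration (this sketch is not from the original interview; the component name and behaviour are invented), a new component can descend from a built-in control and override only what it wants to change:

type
 TUpperEdit = class(TEdit)   { hypothetical descendant of the built-in TEdit }
 protected
  procedure KeyPress(var Key: Char); override;
 end;

procedure TUpperEdit.KeyPress(var Key: Char);
begin
 Key := UpCase(Key);          { force typed characters to upper case }
 inherited KeyPress(Key);
end;

procedure Register;
begin
 RegisterComponents('Samples', [TUpperEdit]);   { put it on the component palette }
end;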

Another key aspect of Delphi is its versatility. Other tools tend to focus either on Windows application development or on Client/Server development, and one always trades off the other. Delphi is equally adept at both, as is evident from the kinds of applications our customers are building. They range from shrink-wrap Windows utilities and multi-media games, through desktop database applications, and all the way up to multi-user enterprise-wide Client/Server solutions. The point is that almost any Windows application needs some form of database access, and any database application needs some form of Windows specific programming--to be productive, you need a tool that does both.

Delphi really leverages a lot of very mature database technology from Borland, including ReportSmith, the Borland Database Engine, SQL Link native drivers for remote servers, and the Local InterBase Server. The Local InterBase Server alone is a tremendous technology that gives developers the ability to use full ANSI-92 SQL in their applications, so they can begin exploring SQL and client/server development all on their local PC.

Q. You emphasise Delphi's versatility as an advantage, but surely it is also a drawback? If one needs to build a client/server application, PowerBuilder offers better CASE/database management facilities than Delphi.

A. There's an inherent advantage to being versatile. Look at the computer on my desktop. Do I need a dedicated word processor, a PC for my spreadsheets and a terminal with access to my customer records? No, I've got one PC that's versatile enough to do all these things.

A very large American retail chain--one of the largest--just standardized on Delphi over PowerBuilder precisely because their engineers can do 85% of all their work using Delphi versus 60% of their work using PowerBuilder. That saves them enormous amounts of money and complexity, including in ways you may not have considered. As an example, skills and techniques learned writing a small utility are directly applicable to client/server projects. A lot of today's programmers started out by writing those little command-line utilities in the good old days. It's a great way to experiment with and master the use of data structures, object-oriented techniques, or the Windows API. Consider, also, how using the same tool for a broad range of applications provides a company with a neat training path: someone can start by writing non-database programs and then gradually move on to projects dealing with valuable corporate data.

There's no end to the components and views you can add to Delphi. The population of programmers who can build components in Delphi is much larger than with any other tool on the market. We're back to the days when one programmer in one room can build and test something that can be used by tens of thousands of other people. Can you imagine what the availability of specialized component sets will be like in six months? In a year?

I think the entire point of combining a component-based visual development environment with an object-oriented compiler and database technology is to make sure you never run out of gas. That's not a bug--it's a feature.

Q. If you want a quick and dirty hack, surely it makes sense to use Visual Basic, because everybody can use it without having to master a scary, complex language like Pascal. If you are doing multi-media or real time work, why mess around with a system which delivers slightly slower performance, and requires you to hand-translate all the header files for any DLLs you may need, when you could just use C++?

A. As we like to say, "It's not your father's Turbo Pascal any more". We made sure that the Object Pascal code you have to write is as easy as BASIC, but without the limitations.

We've taken great pains to make sure that when you're interacting with components, the code you write is as simple as possible--but no simpler. Many reviewers have remarked that they thought they were coding in Basic when they first started using Delphi. It's that easy. When they want to do something more interesting and start using the richness of the language, they usually start remembering how much they like Pascal.

In fact, I think you miss an essential advantage of Delphi. Anybody who has used a compiler--especially one that supports good type-checking--knows that a compiler is really a programmer's best friend. When it tells you it's probably not a good idea to take the square root of your Window caption, it's showing you a logic error in your code and saving you time. Is it an advantage that BASIC will perform automatic type conversions in that circumstance instead of giving you an error? I wish my spell checker program could complain about the logic of a paragraph I've written in the same way as our compiler warns you about illogical programming statements. Our 32-bit compiler goes even further and offers you all sorts of hints about problems it detects in your program. This kind of help is invaluable and one of the things that makes programming in Delphi very productive.

Q. What is the secret of Delphi's fast compile/link cycle?

A. Borland has over ten years of experience in building the world's fastest compilers, and we've put that knowledge to good use in Delphi--it compiles at about 350,000 lines per minute on a 90 MHz Pentium. A number of factors contribute to this throughput. Delphi units (code modules) compile to .DCU files, which you can think of as a combination of a C++ precompiled header file and an .OBJ file. (It's funny how the hot topic in the C++ community is pre-compiled header files and incremental linking--Borland's Object Pascal technology has had these features for more than eight years.) Delphi units specify what other units they depend on through USES clauses--sort of like C++ #include's of header files. By analyzing the USES clauses of each unit in a project, the compiler can automatically perform minimal builds with no need for a make file. The net result is that the compiler never compiles more than it has to, and it never compiles the same thing more than once. Finally, the clean syntax of Object Pascal allows for very fast parsing.
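
For readers who haven't seen a unit before, a minimal one looks something like this sketch (the unit and function names are invented for illustration and are not from the interview):

unit Totals;    { compiles to TOTALS.DCU }

interface

uses SysUtils;  { dependencies are stated explicitly, so no make file is needed }

function SumOf(const Values: array of Integer): Longint;

implementation

function SumOf(const Values: array of Integer): Longint;
var
 I: Integer;
begin
 Result := 0;
 for I := Low(Values) to High(Values) do
  Result := Result + Values[I];
end;

end.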

Q. Is the compiler engine itself written in Delphi? How much does it differ from the Borland Pascal 7 compiler?

A. The compiler is written in assembly language. It is fully backwards compatible with BP7, and we've added lots of object-oriented extensions such as class references, virtual constructors, and the IS and AS operators. We did a lot of work to enable declaring, registering and filing properties, and we generate run-time type information that's used to communicate published property, event and method information to the development environment. You'll see some interesting applications of that capability in our 32-bit release. One unique enhancement was our use of bound method instance pointers to implement event delegation. They're very efficient and fit nicely into the language. And of course we did a lot of work to add structured exception handling. In addition, there are lots of little niceties that people have requested, such as support for C calling conventions.
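
To make the property, event and method-pointer machinery a little more concrete, here is a sketch (the TCounter class and all its names are invented for illustration; this is not code from the interview):

type
 TChangeEvent = procedure(Sender: TObject) of object;   { a bound method instance pointer }

 TCounter = class(TComponent)
 private
  FValue: Integer;
  FOnChange: TChangeEvent;
  procedure SetValue(NewValue: Integer);
 published
  property Value: Integer read FValue write SetValue;              { appears in the Object Inspector }
  property OnChange: TChangeEvent read FOnChange write FOnChange;  { an assignable event }
 end;

procedure TCounter.SetValue(NewValue: Integer);
begin
 FValue := NewValue;
 if Assigned(FOnChange) then
  FOnChange(Self);   { event delegation: invoke whatever method was assigned }
end;

The IS and AS operators then let any code test and cast a reference safely, for example: if Sender is TCounter then (Sender as TCounter).Value := 0;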

Q. Delphi implements objects in a manner similar to Apple's Object Pascal, with all objects allocated on the heap. Previous versions of Turbo/Borland Pascal used a more C++-like approach, with the ability to allocate objects on the stack and statically. Can you explain the reasoning behind this change in approach?

A. It really is a question of features vs. complexity. The philosophy of Delphi's Object Pascal language is to deliver the RIGHT set of language features, as opposed to any language feature ever known to mankind. It's the well-known 80/20 rule: You can get 80% of the power for 20% of the complexity, but squeezing out that last 20% of power makes the whole thing five times as complex to program. Mixing static and dynamic allocation of objects is one of those features that fall into the latter group. By implementing a pure reference model we were able to simplify the entire Delphi component library, and do away with a lot of the pointer management that plagues other products. Even though Delphi objects are allocated on the heap, in a typical Delphi application you never have to deal with allocating and freeing them.
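
In practice the reference model looks something like this sketch (a typical ownership pattern rather than code from the interview; the form and handler names are the defaults the IDE would generate):

procedure TForm1.FormCreate(Sender: TObject);
var
 OKButton: TButton;
begin
 OKButton := TButton.Create(Self);   { heap-allocated; the form becomes the owner }
 OKButton.Parent := Self;
 OKButton.Caption := 'OK';
 { no Free call needed -- the owning form disposes of the button when it is destroyed }
end;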

Q. I find this answer quite surprising and counter-intuitive. You had already implemented mixed static/dynamic allocation, and therefore presumably cracked the problems involved, so why go to the trouble to revert to the Apple Object Pascal approach which you had initially rejected? Is, say, a stack allocated object, with constructors and destructors automatically called as the thing moves in and out of scope, really more complex than a heap allocated object, where you must make special provision to kill the thing off at the end of its life? I would have thought that the fact that the component library *mostly* frees objects automatically but *sometimes* doesn't would tend to add to rather than reduce the application programmer's burden. Also, the change in model must confuse both existing BP programmers and also migrating C++ users.

A. Again, we didn't revert to anything because we really started with a clean slate. Our class reference model is sufficiently powerful and flexible, so having only one sort of class is actually an advantage. Once you give someone two ways to do the same thing, you have made your product less usable and you have to now help them understand when to use a statically allocated class versus a dynamically allocated one. We're quite happy with the choice we've made. It's simple to understand, efficient, and allows us to add garbage collection in some future release. And, of course, if you've got old code from BP7 that uses old style objects, you can still compile it from within Delphi.

Q. Exception-handling - what were the major influences on your design?

A. We looked at a number of languages and implementations, and were most influenced by C++ and Modula-3. Delphi is like C++ in that exceptions are classes, but more like Modula-3 in terms of the supporting language constructs.

Exceptions are a quiet revolution--they truly simplify the way you write code. For the most part you can write your code as if errors will never occur, instead of spending the bulk of your time trying to determine if an error occurred, and if so, how best to clean up and back out of what you were doing. Delphi's Visual Component Library was designed from the ground up with exception handling built in, and that is a large part of the reason why Delphi and applications written in Delphi are so fault tolerant. One of my favorite demos is a little two-liner that, on the click of a button, assigns NIL to a pointer, and then dereferences the pointer. Each time you click the button, Delphi reports that a General Protection Fault exception has occurred, but because of the built-in exception handling logic, the app keeps running instead of bringing itself down.
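
Reconstructed from the description above (not Anders' actual code), the demo amounts to something like:

procedure TForm1.Button1Click(Sender: TObject);
var
 P: ^Integer;
begin
 P := nil;
 P^ := 1;   { dereferencing NIL causes a GP fault, which the runtime maps to an exception }
end;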

Q. I'd like to draw you out a bit to expand the answer above with a few specifics.

A. As in C++, an exception in Delphi is simply a class, which means you can take advantage of the inheritance mechanism to handle whole sets of exceptions easily. For example, Delphi declares the following classes which deal with floating-point exceptions:

type
 EMathError = class(Exception);
 EInvalidOp = class(EMathError);
 EZeroDivide = class(EMathError);
 EOverflow = class(EMathError);
 EUnderflow = class(EMathError);

As you can see, EMathError is the ancestor of the other exceptions. Here's an example of a TRY..EXCEPT statement that handles floating-point exceptions:

try
 PerformCalculations;
except
 on EZeroDivide do ...;
 on EMathError do ...;
end;

If the PerformCalculations procedure raises an EZeroDivide exception, it is handled by the first handler. If it raises any other EMathError exception, the second handler takes care of it. Since there is no ELSE clause, no other exceptions are handled--they are instead propagated to an enclosing exception handler.
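
For completeness, a variant with an ELSE clause would look like the following sketch (the handler names are placeholders, not from the interview):

try
 PerformCalculations;
except
 on EMathError do HandleMathError;     { placeholder: handles EZeroDivide, EOverflow, etc. }
else
 ShowMessage('Unexpected exception');  { ELSE catches anything not matched above }
end;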

Q. Delphi's ability to handle GP faults is indeed one of its neatest tricks. Was it difficult to implement?

A. It wasn't too bad, but it did take some nifty use of TOOLHELP.DLL, which implements the Windows low-level system tools interface. We basically register an interrupt callback function which maps processor faults into Delphi exceptions. The reason that it all works, though, is that VCL was engineered from the ground up to be exception aware. Because of that, when a GP fault occurs and is mapped into an exception, the operation that was in progress will automatically know how to back out and clean itself up.

Q. Can we expect any other major syntax additions/changes, for example Eiffel style assertions?

A. We're always evaluating new language features, and surely there will be some in the upcoming 32-bit version. I'd rather not get into specifics, but as a rule, we don't really think about language extensions in the abstract. Instead we look at the language as part of a bigger picture (class library, component model, visual environment) that must evolve as a whole to support new technologies and improve ease of use.

Q. Can you give Delphi programmers any guidance on how best to write applications that will be portable to the 32-bit version of Delphi? The new "Cardinal" data type has arrived almost completely unnoticed. Are there any other issues we should be aware of?

A. Delphi's Visual Component Library was designed with portability in mind. As long as you stay away from in-line assembler, 16-bit pointer arithmetic, and Windows 3.1 API functions which aren't supported in the Win32 API, your apps should port with little or no modification.

The Cardinal and Smallint types were introduced to facilitate portable code. Of the built-in types, Shortint, Smallint, Longint, Byte, and Word have identical representations in 16- and 32-bit code. The Integer and Cardinal types, on the other hand, represent the most efficient signed and unsigned integer types of the particular platform. In the 16-bit version they are 16-bit entities, and in the 32-bit version they are 32-bit entities. In general, you should use Integer and Cardinal whenever possible, and Shortint, Smallint, Longint, Byte, and Word only when the exact storage representation matters.
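
As a quick illustration of that guidance (the variable names are arbitrary, not from the interview):

var
 Count: Integer;     { most efficient signed type: 16 bits in Delphi 1, 32 bits under Win32 }
 Mask: Cardinal;     { most efficient unsigned type: likewise 16 or 32 bits }
 FileSize: Longint;  { always 32 bits -- use when the exact storage size matters }
 Flags: Word;        { always 16 bits }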

Any 64K limitations found in the 16-bit version will disappear in the 32-bit version. For example, the 32-bit version allows you to declare arrays and allocate heap blocks of any size up to 4GB!

Q. What is the current state of the 32-bit version? Will it support 16-bit VBXs, like BC++? Delphi 16-bit code runs somewhat slower than C++ - are you doing anything about this for the 32-bit version?

A. Delphi was written to be portable--we've been working on the 16- and 32-bit versions in parallel since day one. The 32-bit version is in field test now, and it will ship shortly after the commercial release of Windows 95. Yes, there is a foundation of 32-bit VBX support technology available in-house, but our primary focus is OCX controls. That's what the competition is working on, and that's where we see the market going. With respect to better code generation, Delphi-32 generates the same high-quality code as Borland C++ 4.5--in fact, they use the same optimizing back-end code generator.

Q. Are there any plans for Borland produced or badged add-ons for Delphi, in addition to the Visual Solutions Pack?

A. We just released the RAD Pack for Delphi, which includes Turbo Debugger for Windows, Resource Workshop, the Resource Expert, Visual Component Library source code, the--much requested--Language Reference Manual, and Visual Solutions Pack 1.1. We did have some quality problems with the initial release of VSP, but those have been resolved, and we now have a Companion Products group to provide Borland-quality add-ons, such as Notes support for Delphi programmers and other often requested components.

Q. Delphi is a terrific tool for rapidly developing state of the art software, but a number of shareware authors have expressed a wish that executables could be made smaller. Is it technically feasible to create a DLL-based version of VCL? Surely this must be possible since COMPLIB is a DLL which is used by the Delphi design environment?

A. It's something we're looking at, and certainly some of the 16-bit complexities with respect to multiple DLL clients are gone in 32-bit land. At this point I can't really comment on specific solutions, other than to say that we're actively looking at ways to make our executables even smaller.

Q. A long-standing and major criticism of Borland Pascal is the proprietary nature of the object file format. It's appreciated that going to the OBJ file format would be a retrograde step, but why won't Borland at least document the file format? That way, developers can create their own tools such as disassemblers, C to Pascal linkers and so forth. Again, it's understood that the file format changes with each release of the compiler, but documenting the changes with each new version would enable other developers to create conversion tools even if Borland don't want to do this. At the moment, if you don't have the source code, all your units become useless each time the compiler is updated.

A. We're well aware of these issues, and the 32-bit version will address them in a number of ways. What I can tell you at this point is that the 32-bit compiler has an option to produce .OBJ files, which can be linked with .OBJ files produced by other compilers.

Q. A related issue: the move to Windows has diluted the importance of the OBJ issue, because you can now call DLLs. But the Delphi user has still to translate the (typically) C/C++ headers into Delphi import units, an exercise which is at best tedious and time consuming and, if you happen not to have had C++ experience, quite hard. It's the sort of job best left to a machine. Given that Borland has a lot of C++ parsing expertise lying around on the ground, have there ever been any plans to create such a tool?

A. Well, I'm not sure which C/C++ headers you're talking about. We've already translated all the Windows and OLE 2 API header files, and corresponding interface units are included with Delphi. But you're right, if you have a 3rd party DLL that was previously only interfaced to C/C++, somebody will have to do the translation. Usually, it's not that bad and I think you'll see an increasing number of vendors providing Delphi interface files for their DLLs. Also, I think you'll see more and more products take advantage of the OLE 2 ITypeLib and ITypeInfo interfaces, and we'll provide a tool that takes that information and produces a Delphi interface unit.
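
For what it's worth, the hand translation usually boils down to a declaration like this sketch (the routine, parameter and library names are invented; the calling convention has to match whatever the C header declares):

{ C header (hypothetical):  long FAR PASCAL ComputeTax(long amount); }
function ComputeTax(Amount: Longint): Longint; far; external 'TAXLIB';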

Q. The ability to create a single EXE for redistribution is very attractive, but somewhat spoilt by the need to include the BDE with database applications, even if they only want to access the odd DBF. Any plans to clean this up?

A. We're working with several third parties, including SAX Software, Eschelon Development, Sterling Software, Great Lakes Software, and Shoreline Software. They have, or will soon have, products to help you deploy your Delphi database applications. In addition, we're making a deployment kit available, via CompuServe and Internet.

Q. Also on data access: is it possible to modify/inherit from the data access controls to provide, for example, 'native' access to FoxPro/Clipper databases? If so, are any such products being developed by Borland or Third Parties?

A. I know of several Third Parties working on native access to FoxPro/Clipper as well as B-Trieve. Some of them are in beta at this point. You can contact them for additional information on the DELPHI CompuServe forum or find out about them in the Delphi "Power Tools" catalog.

Just to clarify: Sax, Eschalon, Sterling etc do install/setup tools that work with Delphi and the Borland Database Engine. There are other companies creating components that provide direct access to BTrieve, Fox, and other database formats.

Q. What is Delphi's main niche in the developer tools market? Compared with, say, Visual Basic, PowerBuilder and C++?

A. Delphi is a general purpose Windows RAD (Rapid Application Development) tool. The point is that Delphi is NOT a niche tool. From the outset, we've designed Delphi to be able to take you from prototype to production, whether you're targeting a Client/Server environment or just writing a Windows application. I hear our competitors say that you can use their tool for rapid prototyping, and then port your app to C++ for production. But you know, rapid application development isn't really rapid unless you can go from prototype to production, all using the same tool! I also hear how competing products will address performance issues by generating C or C++ source code. This idea of building the application with one tool, and then having it generate C or C++ source files that have to be run through another tool, is ludicrous. How are you expected to debug the final code? Requiring users to find bugs in machine-generated C++ code, and understand how that maps to the original 4GL code, just doesn't make sense. We've been shipping development environments with integrated compilers for 12 years--I think the time is gone when programmers would accept anything else.

Q. How have Borland's recent troubles affected Delphi's development and its take-up? Was the absence of a Language Reference Manual in the initial product a consequence of these troubles? (You had to mention the Language Reference Manual?  My mistake, I'm sorry! --Zack)

A. In the two years we were developing Delphi, the company did go through some difficult times. That was all resolved before we shipped. Now the entire company is focused on development tools, we've won the Lotus lawsuit, we've launched Delphi and Delphi Client/Server worldwide, and both products continue to sell well above expectations. In fact, I understand that Gray Matter reports that Delphi is the #1 selling development tool in the UK. The Delphi development team is 100% intact, and focused on Delphi for Windows 95.

We really underestimated the demand for the Language Reference Manual. (You're telling me! --Zack)  It will be included in Delphi for Windows 95. Meanwhile, we've made good by uploading an Adobe Acrobat version to CompuServe and our WWW page, and a printed version is now also available from Borland.

Q. We are all pleased you resisted the opportunity to christen a product Power Visual Turbo Pascal Objects for Windows - but how did it come to be called `Delphi'?

A. We actually tried to call it "Power Visual Turbo Pascal Objects for Windows", but that name was already trademarked :-). One of the senior guys in QA (Danny Thorpe) dreamed up Delphi as a code name quite early on, and every time we did a market survey of product name candidates, everyone said "well, those are ok, but we really like `Delphi'". So in the end, we kept it.

Q. Which part of Delphi are you most proud of? ... and which part least?

A. The thing I'm the least proud of is probably the initial lack of a Language Reference Manual. But that's taken care of now. (Dammit, I said I was sorry already! --Zack)

What I'm most proud of is the fact that the energy we invested in foundation technologies like extensibility and exception handling enabled us to build Delphi in itself. Can you imagine VB or PowerBuilder written in themselves? By building Delphi in Delphi, we really got to feel on our own bodies what was right about the product, and what needed fixing. I sometimes hear frustrated users comment "The programmers that wrote this %&##$ thing should be forced to use it themselves!". Well, we did, and we're really proud of the result.

-------- CHRONOLOGY --------

  • 1960 Anders born in Copenhagen, Denmark.
  • 1979 Enrolls at the Danish Engineering Academy. Co-founds PolyData, one of the first Danish microcomputer companies.
  • 1980 Releases his first Pascal compiler--a 12K Pascal subset in ROM for the British NASCOM Z-80 based kit computer. Eventually sells the rights to this product to Lucas Logic.
  • 1982 PolyPascal for CP/M-80 released. Product is now a complete implementation of the Pascal language.
  • 1983 Sells the Borland founders (Niels Alex Jensen, Ole Henriksen, Mogens Glad, and Philippe Kahn) on the idea of a Pascal compiler with an integrated editor. In November releases Turbo Pascal 1.0 for CP/M-80, CP/M-86, and MS-DOS. The newly formed Borland company, essentially penniless, places an advert for Turbo Pascal in Byte, bluffing the Byte Ad executives into giving them credit. The compiler is priced at $49.95 and is an instant smash hit.
  • 1986 Turbo Pascal 4.0 released, featuring an Integrated Development Environment--the first of its kind for the PC environment--and introducing modular compilation (previously Turbo Pascal programs had to be compiled all in one go and could be no larger than 64K unless overlays were used). CP/M support is dropped.
  • 1988 Turbo Pascal 5.0 released, featuring integrated debugging and VROOMM (Virtual Runtime Object Oriented Memory Manager) overlay management technology.
  • 1989 In response to Microsoft's object oriented QuickPascal, Borland releases Turbo Pascal 5.5, which has its own OOP extensions. Microsoft later drops QuickPascal from its product line.
  • 1990 Turbo Pascal 6.0 features a new, much improved Integrated Development Environment, and includes the Turbo Vision object-oriented application framework.
  • 1991 First release of Turbo Pascal for Windows. Features a Windows hosted IDE and the ObjectWindows Library (also known as OWL).
  • 1992 Borland Pascal 7.0 includes both a DOS and a Windows hosted IDE, and allows developers to target DOS, DOS Protected Mode, and Windows.
  • 1995 Delphi and Delphi Client/Server released on schedule on February 14th.

--------

This interview originally appeared in EXE, the UK's leading programming magazine. If someone can send me a .EXE cover or logo, that is appreciated!
Thanks to Ben Riga for sending it to me. Thanks for also not mentioning the Language Reference Manual.


San Francisco is Breaking My Heart


I moved from the Bay Area about five years ago to live in Michigan. However, I am still back in the Bay Area routinely working with software companies, and it's breaking my heart. When I was at Zendesk I lived near our first office (510 Townsend) near Caltrain. When Zendesk moved to Market and Sixth Street, it was kind of a sketchy area at night, but I got used to the chaos of being near the Tenderloin district. There were routinely fights in the alley behind our building. I saw a woman servicing a guy in a doorway as I left work. Another time a naked woman ran screaming down the street. Someone got shot one time late at night outside our office in what we learned was a drug deal gone bad. An employee got mugged one evening and chased the mugger, which is not really best practice, but he was Ukrainian and he didn't take shit from anyone. My wife and I were on a Muni bus going home one evening and a guy stole the entire roll of bus transfer tickets and ran off the bus. The bus driver ran after the guy. When he returned ten minutes later, he said the guy had done the same thing a week earlier.

My point is that there's been a homeless problem for a long time. Long-time residents know that this goes back to the 1980s. Hell, maybe earlier than that. But you could walk down Market Street during the day and it was okay. Market and Sixth was crappy, and the Tenderloin was sketchy, but that was it. The financial district was pretty good. Downtown was fine. North Beach was cool, Embarcadero was ok. I used to run along China Basin and that was fine too. If you squinted, San Francisco was still a nice place.

I don't think that's true anymore.

This isn't Times Square in the 70s or Detroit in the 90s. It's far worse than that. It is epidemic. 

I've been helping out a company located at Fourth and Market, one block from Moscone Convention Center. It's a shit hole. Market and Fifth is worse. Market and Sixth looks like a third world country, with people selling random stuff: iPhone cords, CDs, mouthwash, razors set out on a piece of carpet on the street. You can't walk down Market more than a hundred paces any time of day without finding a body on the ground. Drunk, drugged, whatever. There are tents on sidewalks, people sleeping in doorways. Human feces on the street, needles, you name it.

Today I was meeting with a colleague at Peet's on Market. It's pouring rain and a young man comes in, thin, somewhat scruffy in appearance, likely homeless, bleeding from his ear and sits near us. I offered him some ibuprofen and gave him a wad of cash. He spoke broken English and didn't seem to want either, but I left the cash in front of him. Sometimes I give homeless people food, but rarely money.

I don't know what to think of SF anymore.


General Magic


I realize I'm late to the party on this, but I happened to catch the documentary "General Magic" recently and it's an amazing piece of work. It had been on my list of films to watch since early 2019, but somehow never bubbled up to the top. So it was fortunate that it was available on a Delta flight I had recently. For anyone who wants to understand the ups and downs of what it's like to be in a high-tech startup, this is an in-the-trenches documentary that captures startup reality better than anything else I've seen.

The film documents the rise (and fall) of Apple spinoff General Magic, a company that embarked upon the audacious goal of producing the first connected, handheld personal digital assistant (PDA), predating such devices as the HP 95LX, Psion 3, and Palm Pilot. General Magic was founded in 1989 by Marc Porat, Andy Hertzfeld and Bill Atkinson, the latter two among Apple's most famous engineers for their work on the original Apple Macintosh. The company became a hotbed of innovation and secrecy. No one quite knew what they were up to, but given the pedigree, it was going to be big. The company attracted dozens of other prominent engineers and up-and-comers, including Susan Kare, who designed the UX; Tony Fadell, who went on to help create the iPod, the iPhone and Nest; Andy Rubin, who created Android; and Pierre Omidyar, who created eBay.

The documentary includes a large quantity of historic footage. The team at General Magic knew they were working on something important and they hired a team to film all of it. So there are team meetings (bean bags on the floor), demos, late night sessions, press conferences, nerf-gun fights and more. The historic footage is interspersed with contemporary interviews with key executives, press and analysts looking back on what they accomplished and why the company failed.

It's a heartbreaking story. Here's a company with vision, financial backing, genius engineers, and a huge market opportunity, and it was obliterated to the point of obscurity. I doubt that today's startup engineers have even heard of General Magic, the Magic Cap operating system or any of the tools from that era. But the irony is that all of the technology became widespread within 20 years. That includes technology that later surfaced in the iPhone, Android, Twitter, and Amazon. Ultimately, the people behind General Magic created multiple billion-dollar technologies and arguably entirely new industries. It was the hard lessons of General Magic's failure that ultimately resulted in the triumph of its many brilliant engineers.

Having worked in the software industry since the late '80s, this film captures the essence of Silicon Valley better than anything else I've seen. It's a powerful tribute to what it takes to succeed in the valley and asks the question: is it worth it? 

Here's the trailer:

 


iWoz - Steve Wozniak


I must have read a dozen books about the founding of Apple Computer over the years, so when co-founder Steve Wozniak wrote his autobiography iWoz a few years back, I made a mental note of it, but never got around to reading it. Woz and I happened to be speaking at the same conference last week. They say you should never meet your heroes, but meeting Wozniak was a real honor. He is one of the most talented engineers and nicest guys I've met. His onstage interview was a bit scattered (imagine opening a faucet of Woz), but it was great to hear him speak about his experiences in designing the Apple II computer, how BASIC and VisiCalc helped turn the Apple II into a platform, the importance of privacy in social media, etc.

iWoz is co-written with veteran PC Week reporter Gina Smith, though told in Woz's unique style. Smith spent over a thousand hours interviewing Woz, then transcribing and reviewing the interviews, and suffering through numerous Woz pranks. Woz's sense of humor and lightness come through in spades.

Although Woz was less a part of Apple than Steve Jobs in later years, those who know the history know that Apple would not have existed without Woz. The Apple II computer effectively created the personal computer revolution. It was the first all-in-one design, the first computer that you could plug in and use. The Apple II+ was the first computer I bought, and I still have one in my closet. It works perfectly 40 years later.

With the average iPhone app today weighing in at 100MB or more, it's hard to appreciate how much work went into creating software that could run on a 48K machine running at 1MHz. It was Woz and other early Apple employees like Randy Wigginton, Chris Espinosa, Bill Fernandez, and Daniel Kottke who labored to make the Apple II a reality. Woz gave some of these early employees shares of stock worth tens of millions of dollars out of his own pocket when Jobs refused.

For those interested in the history of Apple, this book is a unique opportunity to hear it from the guy who built it. You get everything from selling blue boxes with Steve Jobs to how he built what became the Integrated Woz Machine (IWM) disk controller for the Apple II and Mac.

Here are a few excerpts from the Q&A with Woz:


More Tech IPOs

Pagerduty IPO

Looks like we're in another boom year for Tech IPOs! While everyone is obsessed with the mega IPOs like Uber, Airbnb and the like, I think we're seeing a very healthy number of B2B IPOs happening. Zoom and PagerDuty have gone out successfully and others like Fastly and Slack are on deck. I was an advisor to PagerDuty founder Alex Solomon in the early days, and it's great to see how the company has continued to grow. It's a company that had its share of growing pains in the early days as it built the management team. Ultimately Alex recognized he needed an outsider to scale the company to IPO level. He hired Jennifer Tejada, who has done a fantastic job, while he's chosen to focus more on the technology. (Both of them are in the photo above.)

Here's to the many other B2B tech companies that have crossed $100m in revenue and are heading for their IPO.


Congrats to Duo Security!

Congratulations to Duo Security, which announced that it is to be acquired by Cisco Systems for $2.35b. This is a great outcome for all involved, and I'm very proud of what the team has accomplished.

I worked with Duo for about three years, initially as an advisor and ultimately as Chief Operating Officer running Sales, Marketing, Products, Engineering and Services. I helped grow the company from around $7m in Annual Recurring Revenue (ARR) to about $100m. The company has continued to grow to 700 employees, 12,000 customers and revenues that I estimate could exceed $200m ARR by year end, based on prior published numbers.

Duo is the fastest growing company I've been a part of; faster even than Zendesk or MySQL. When a company grows this quickly, it becomes a different organization every year. The early sub-$10m revenue company is quite different from where Duo is today. I would always tell new employees to be prepared for change. What remained constant was an underlying set of midwestern values of hard work, customer care and innovation that made Duo special. (Also we had a really good fun band called "Louder Than Necessary.")

It's a testament to the founders' vision and the management skills of the leaders we recruited that the company scaled so well. I remain especially proud of the many people we hired, promoted and developed to become the future leaders in the company. As news of the acquisition came out, many people have asked me about the deal, so here are my thoughts...

First of all, this should be recognized as an absolute success for the management team. To grow a company to this size and value is very rare. Less than 1% of venture-backed companies get to a valuation of $1 billion. It's also one of the biggest software successes in the midwest and proves that you don't have to be in Silicon Valley to win big. (Duo has in fact expanded into San Mateo, Austin, London and Detroit. Part of Duo's success is due to these multiple locations, but that's another story.)

Secondly, this deal creates a larger force in the industry. There is no doubt that Duo could have proceeded towards an IPO in 2019; they had in fact hired several new executives to lead these efforts in recent months. But the combination of Cisco and Duo together is more significant than Duo on its own. There has been a large amount of consolidation in the security space in the last few years and I believe Cisco will emerge as a leader. They have the respect and attention of Global 2000 CISOs and CIOs, a strong worldwide sales machine and a large number of related products. In a few years time, it will be clear that Microsoft isn't the only company that is capable of reinvention. 

Thirdly, this represents an opportunity for further growth for the team at Duo. Cisco plays on a larger global stage than Duo could on its own. But Duo's executives, managers, engineers, security experts, sales people, support engineers, product team and marketing organization have a lot of mojo to contribute. Duo has become one of the fastest growing SaaS companies on the planet and they know a thing or two about making security easy. The company has a Net Promoter Score (NPS) of 68, one of the highest in the industry!

And that is the key to why this deal makes sense to me. The combination gives Duo scale. But it also injects speed and innovation into an industry that needs it. The old approach to security, locking down your network with a VPN and using old-fogey security tokens, doesn't work when your applications are in the cloud, employees are mobile and hackers are targeting everyone and everything. I believe Duo is well positioned to lead with a modern new approach to security.

There's also a fourth point, which in the long term could become even more significant. Duo's success injects a large amount of capital into the Ann Arbor / Detroit area. The company has also developed tremendous expertise in building a SaaS company at scale. That combination of capital and talent will result in the creation of additional startups in coming years. Duo's investors (Benchmark, GV, Index, Redpoint, Renaissance, True Ventures...) did very well and are likely to be open to investing in new startups in the region alongside other firms focused on the midwest such as Drive Capital, eLab Ventures, Steve Case's Revolution, RPM Ventures and others. This acquisition shines a spotlight on Michigan's growing tech scene and that will have all kinds of positive impact on investment, innovation and job creation.

To all my friends at Duo, this is a vote of confidence in all that you have created. Congratulations on achieving this milestone. Now there's an even bigger opportunity to take this product line, security expertise and company culture to a bigger audience than we ever thought possible.

Go Duo!  

 


Bumper Crop for IPOs in 2018


It looks like 2018 will be the strongest year for tech IPOs in recent history. Although some of the largest companies (Airbnb, Uber) are still waiting on the sidelines, so far we've seen a large number of very successful IPOs including Dropbox, Zuora, Zscaler, Spotify and others. This week there were three strong IPOs: Ceridian, DocuSign and Smartsheet, which popped 30-42% in their debuts. These companies all have much stronger fundamentals than some of the troubled IPOs of 2017 (Blue Apron, SnapChat) which gave investors pause.

Pivotal is the only IPO that had a very modest rise on its debut, a day when the markets were down overall. But in the following week, Pivotal has risen about 20%. DocuSign is a good example of a cloud company that has pursued a massive market opportunity. DocuSign has a $2 billion run rate ($518m revenue in Q1, doubling over the last two years) with a quarterly loss of $52m, compared to $115m a year ago. While the company certainly could have gone public earlier, at a $6 billion market cap, the wait seems to have been worth it.

One of the factors fueling demand for new IPOs is the strong Q1 results among public tech companies including Amazon, Facebook, Microsoft and even Twitter. Microsoft's stock has recently hit a historic high, based on the growth of its cloud business. Considering that Microsoft was a laggard in this space, it speaks not only to the disciplined management that CEO Satya Nadella has put in place, but also to the huge upside that still exists for tech companies with recurring-revenue SaaS offerings.

And that's precisely why we should expect to see even more IPOs in the second half of 2018.  It looks like Acquia, Anaplan, Avast, CarbonBlack, Domo are likely to go out in the next few months. I would expect to see quite a few IPOs between now and mid-August when bankers head out on vacation and then more in the fall.

While there has been a lot of volatility in the market earlier this year, it has still been a nine-year bull market and at least in the tech sector, it seems that there is still a lot of headroom for growth.

What companies do you think will IPO in the rest of 2018? Will the bull market keep running? Let me know your thoughts by posting a comment below.

 


Are Tech IPOs Back in Fashion?


Much has been made of the slowdown in tech IPOs in recent years, but that trend appears to be changing in 2017. Of course, there are a few mega-companies that continue to sit on the sidelines (AirBnB, Uber, DropBox, I'm looking at you!) but I think we will continue to see improvements in 2017 and 2018. Perhaps not as strong as the record number of IPOs of 2014, but likely enough to reverse the declining trend from 2015 and 2016.

Early this year we saw IPOs from the likes of Snap, MuleSoft, Alteryx, Okta, and Cloudera, among others. Other than Snap, which was rather over-hyped, most of the others had very good returns for their investors and are continuing to trade above their IPO price. And overall multiples for tech companies on NASDAQ and NYSE are holding steady. I'm especially encouraged by the performance of B2B software companies MuleSoft, Okta and Cloudera. MuleSoft now has a market cap over $3b, and Cloudera and Okta look likely to cross that threshold later this year based on their steady growth and increasing efficiency. It looks like B2B stocks are once again in fashion.

My expectation is we'll see a bit of an IPO slowdown during the summer and then a significant uptick in the fall. For B2B SaaS companies getting to $100m or beyond in annual recurring revenue (ARR), this will be an interesting time.


Duo Security More Than Doubles in 2016


I'm very proud to see how much Duo Security has continued to grow. In 2016 we more than doubled Annual Recurring Revenue (ARR) year over year, finishing at $73m and becoming cash-flow positive for the year. I've been a part of several high growth companies including MySQL and Zendesk, but Duo is the fastest growing and most efficient.

We've also announced our new Duo Beyond offering, which adds even more capabilities to go beyond traditional two-factor authentication. Hopefully 2017 continues to be an excellent year for SaaS security companies.


Hark: The Software Paradox


Stephen O'Grady at RedMonk has launched a new podcast called Hark. In his second episode, he and Agile programming guru Kent Beck have a thoughtful discussion around the ideas in O'Grady's book "The Software Paradox." Even though software is "eating the world" and becoming more widespread and strategic, its economic value appears to be declining rapidly. Certainly, we've seen a shift in the industry from traditional on-premise software commercialization to distribution models like open source and software-as-a-service, with vastly different business models.

Simply put, the software industry is undergoing a significant disruption that is reshaping the economics of the industry and rendering older "tried and true" business models obsolete. And at a level that strikes closer to home for many, it's also reshaping employment models and careers. Although the parallels are not perfect, the software industry is going through a transformation much like the publishing industry or the music industry. (And we all know how well that turned out for writers and musicians!)

I would argue this transformation has been going on for at least ten years already, since the emergence of successful open source companies. In the early days of MySQL, Marten Mickos regularly talked about how his goal was to disrupt the database industry, taking it from $9 billion in revenue to $3 billion, and then capturing a third of that. While this was possibly more bravado than business plan, it was based on the fact that MySQL was 90% cheaper than Oracle. (And for many, MySQL was 100% cheaper--after all, it was under a GPL license and free for most users.)

While we built a solid business with MySQL, growing it to just short of $100m in revenue and selling it to Sun for $1 billion cash in 2008, the long term impact of MySQL was far higher outside the database industry. MySQL, Linux and other open source infrastructure software spawned thousands of businesses that simply would not have been economically possible under traditional commercial licensing fees. We routinely met founders of companies that said their business was enabled in part because of the dramatically lower cost of building an IT infrastructure. So at least some of the value that MySQL disrupted was captured not by traditional software companies, but by newer companies like Facebook, Google, Skype, Craigslist, Priceline and the like. And many of those businesses also happened to be disruptive, which is why software is eating the world. 

Not surprisingly, there have been very few home runs in the open source business, at least as measured by revenues or exits. Red Hat, JBoss, Pentaho have all been successful as businesses and have had good payouts for their investors. But many more open source projects have had widespread popularity with remarkably little economic value generated. And that is precisely the nature of the Paradox. 

And as Mickos has recently noted, "The bad news is: it's almost impossible to make money on open source. The good news: it has happened many times." At this point it's hard to say what the future successful models for software commercialization might be, but it's certainly not going to be the traditional on-premise up-front license model. And I don't think open source, in all its various forms, is likely to generate a large number of economic home runs. There are definitely a handful of promising companies like Acquia, Cloudera, DataStax, MuleSoft, Puppet Labs, SugarCRM and the like, but they may be more the exception than the rule. (If I missed other rapidly growing open source companies, let me know in the comments below.)

Likely we will see more divergence over time with more value realized in other forms, whether it's service-based models, cloud-based businesses, advertising, data aggregation or perhaps something as-of-yet to be invented.

It's a fascinating topic with more questions than answers at this point.  And I'm sure we'll see more discussion on this topic at the Monktoberfest conference in October. 

The Hark podcast is available on iTunes, SoundCloud or wherever you get your downloads.