Peak Valuation?

It seems kind of crazy, but the tech industry has achieved historic high valuations. A buddy of mine talked about the 100 club--software companies with 100x revenue valuations. Historically, SaaS companies have traded at 7-14x revenue, with a few outliers like Salesforce above that. Now it seems many companies are trading at 25x, 50x and, you guessed it, outliers are at 100x! 

I can understand investor enthusiasm. When you see the robust growth of longstanding SaaS companies like Zendesk, HubSpot, Twilio and Shopify, and younger companies like Datadog, Elastic and MongoDB, it can appear that there's a magical combination of SaaS, developer tools and open source that will grow forever. Likely IPO candidates GitLab and Databricks are doing just that. Many of these high-growth companies have been part of an overall platform shift to the cloud. With accelerated cloud adoption during 2020's COVID quarantine, you might think there's no downside. 

But these companies are exceptional. As a reminder, there's a crop of former high-flyers like Box and Cloudera that have seen their valuations come back down to earth after posting tepid growth. Not every company that VCs invest in is exceptional. Investors (and founders) should keep that in mind.

Hopefully whatever corrections happen are minor and we revert to healthy, rather than historic-high valuations.

What do you think? Are tech stocks overpriced?  Post a comment below.


Launching Delphi - Feb 14, 1995

It was twenty-five years ago today that Anders Hejlsberg and I got up on stage at the Software Development '95 conference and launched Delphi to the world. The joke was that all 1,500 of us were geeks who couldn't get a date even on Valentine's day. 

We knew Delphi was a good product. Maybe even a great one. Our beta testers loved it, the team was excited and we had a 32-bit version in the works for the upcoming Windows 95 OS that would be bigger, better and faster. But the scale of Delphi's success took us by surprise. The Borland booth was mobbed at the conference. Delphi, with the help of DavidI and Charlie Calvert, gave birth to an ecosystem of third party books, magazines, component libraries and more. I've met countless developers over the years who told me Delphi enabled them to learn Windows development, build their career, their business.

So what made Delphi so good? You gotta give credit to Anders. He is probably one of the ten best programmers in the world and certainly the best developer I've ever worked with. He had more than ten years of compiler experience under his belt when we built Delphi. He knew exactly what tradeoffs mattered in language design to balance programmer productivity with machine performance. Delphi compiled to machine code at the speed of 350,000 lines per minute on a 90 MHz Pentium. I have no idea how fast that is on today's machines. But you could load a demo program, hit the run button and by the time you clapped your hands together it was running. And I clapped my hands together every time I gave a demo, just to make the point. 

As Anders pointed out that night on stage, Delphi was written in Delphi. So the team that built Delphi (and it really was a team: Anders, Gary, Chuck, Dave, Allen, Hank, Ray, Marc, Danny, Charlie) used it every single day. We made it great because Delphi was the tool that we wanted to use. It was pretty mind-blowing when Anders loaded the Delphi project source code into Delphi and it compiled itself. 

The Delphi project was not an easy one though. It came at a tough time in Borland's history. The company had been sued by Lotus in 1990 and acquired Ashton-Tate in 1991. By 1993, the company had essentially sold off Quattro Pro and Paradox to Novell after Microsoft decimated the standalone spreadsheet and end-user database markets. Oh yeah, and the founder and CEO, Philippe Kahn, left to create Starfish Software a month before we launched. Philippe helped protect Delphi as a skunkworks project when we started, and he coined the codename VBK (ahem), which none of us liked, but all of us believed in. 

We knew that if Borland was to stay relevant in developer tools, we needed to build something better than Visual Basic. We never saw Delphi as a VB Killer, but certainly a VB Kompetitor. How would we compete with that behemoth? Well, we weren't cocky, but we also weren't afraid of Microsoft. We had to make Windows programming easy enough that a DOS programmer could do it. And in that regard, our prior efforts with Turbo Pascal 7 missed the mark. Borland had a couple of other internal efforts that never saw the light of day (Monet, anyone?) and at some point Gary, Anders and I came to the realization that someone had to make it happen, and that someone was us. Having a native code compiler meant that Delphi would have a huge performance advantage over interpreters. It also meant Delphi developers would be able to create their own reusable objects without having to learn a different language. That gave us huge extensibility. 

We also learned there was another change on the horizon, and that became our opportunity. Borland VP Rob Dickerson had highlighted the need for the company to build a client/server development system. Again, we looked around and we realized Paradox wasn't going to do it, dBase wasn't good enough, C++ was too hard. And so I put up my hand and convinced Gary and Anders that not only did we need to make Windows development easy, we had to take on Client/Server development at the same time. Luckily they agreed, not knowing what Client/Server development meant. I didn't either, but I trusted we would figure it out. Ultimately this became our biggest differentiator in the market. While Delphi could be 2-3x faster than VB, compared to SQL Windows or PowerBuilder it was 5-50x faster, and sometimes 800x faster. 

When we first started, we thought the project might take a year, but that Client/Server stuff was a lot harder than we expected. One of the developers working on that area eventually left the company and when Chuck and Anders looked at his code they just about barfed. That cost us about six months. I'm pretty sure every single person working on the project came to see me and said: "Can't we forget that Client/Server thing and just ship the desktop Windows version?" But my answer was always the same. I drew a curve of what Delphi desktop revenues would be. Then I drew a second line for Client/Server below the first one but growing at a steeper angle, eventually eclipsing the desktop revenues. I don't know if anyone believed me (and I honestly didn't know if I believed it myself) but it put an end to the discussion.  

I knew that the Client/Server product was more important strategically for the company because it would expand our market beyond Borland's traditional base. Ironically, at some point my boss VP Paul Gross asked why we were working on the desktop product, suggesting we skip that completely. I told him Delphi desktop revenues would be $30 million in the first year (a number I made up on the spot) and he nodded and said "good point."  

Delphi's first year revenues were $70 million (far higher than we'd expected) and grew from there. That's about $118 million, adjusted for inflation. And the Client/Server revenues really did eclipse the desktop revenues in the second year. To say Delphi saved Borland was not an overstatement. 

We also made a good bet on shipping a 16-bit version of Delphi first, rather than jumping straight to 32-bit. It was a safe assumption that Microsoft would slip Chicago (Windows 95). So we had a stable 16-bit compiler and operating system and could work on that without having to worry about the ground moving beneath our feet. We were fortunate to get the 32-bit compiler under development in parallel, shipping it just about 12 months later as Windows 95 was gaining market share. Delphi 2.0 boosted performance another 3-4x giving us an even bigger lead.

When we built Delphi we never thought it would last so long or have as much impact as it did. We were grateful for the support and feedback from our customers and third party developers. While we weren't obsessed with press coverage and awards, we were happy that it helped get the word out. I still have the Jolt Cola award on my bookcase. I figured if Delphi lasted to version 3.0, that meant we did a good job. But twenty-five years? Who could have guessed?

Looking back on Delphi 1.0, much of those two years is a blur of sixty hour weeks, late evenings and occasional setbacks. But the memories that stand out were about the team. We were committed to building something great, something that we would use. Gary and Anders (and Chuck, and Danny...) all had great taste. So there was a kind of aesthetic to the product. It's hard to explain, but we knew it as "it works the way you hope it would." Delphi wasn't just fast, it avoided the limitations of many Rapid Application Development (RAD) tools that ran out of gas when you pushed hard. 

I've done a lot of interesting things in the last twenty-five years, but Delphi is the product I'm most proud of. It was a magical time in our lives when we were experienced enough to do good work and young and foolish enough to bite off more than we could chew. We solved some hard problems that mattered in a market that we understood and the market responded. It shaped my thinking about how to build products in ways that I continue to use and teach to this day.

I'm grateful to Anders and Gary that we took on the project. Gary is the best engineering manager I have ever worked with and I was glad to get to work with him again at MySQL. Anders, of course, has gone on to do even greater things architecting C#, .NET and TypeScript. I'm proud of the many developers, writers, testers, product managers and marketers (Lance, Diane, Ben, you were awesome) who built on the early success of Delphi 1.0 to create a legacy that has withstood the test of time.

And thank God we finally got that darned Language Reference Manual out. 

Zack, Gary and Anders, 2011

Got a recollection of Delphi 1.0 or a story about Anders, Gary or me? Post a comment below...


.EXE Interview with Anders Hejlsberg on Delphi (1995)

To commemorate the 25th anniversary of Delphi on Feb 14, 2020, here is a transcript of an interview with Anders Hejlsberg, Chief Architect of Delphi, conducted in 1995 by .EXE Magazine editor Will Watts. Anders discusses the design and development of Delphi and the then-forthcoming 32-bit version for Windows 95. This was the most detailed technical interview published about Delphi at the time.

Q. How did the idea for Delphi evolve from Turbo/Borland Pascal? At what stage did you decide to add the environment, database support etc?

A. The key idea was to design a tool that combines a visual development environment, Client/Server database support, and a native code compiler. Before Delphi, you always had to make a choice. Do I go for the performance of a native code compiler, or the ease of use of a visual development environment? Do I go for a powerful object-oriented language, or a proprietary 4GL Client/Server tool? What programmers really want is all of the above, in one package. That's what we set out to do with Delphi.

What it really boils down to is productivity--we wanted to design a tool that would make developers more productive, all the way from prototype to production code. Other products lure you with visual tools, but once you get halfway through your project, they let you down because of sluggish performance, lack of extensibility, or general stability problems. The competition talks about adding extensibility and improving performance. That's a fundamental difference between their products and ours. Extensibility and performance were on the whiteboard the first day we started designing Delphi, and they permeate the entire product. For example, if you want to design a new component in Visual Basic, you have to write it in another language, such as C or C++ (or Delphi, for that matter). None of your VB skills can be reused, you have to learn a different language, and you can't easily inherit from any of the built-in components. Delphi, on the other hand, allows you to write new components in Delphi, and you can inherit from any of the built-in ones. That's true extensibility, and it translates into a substantial productivity boost.
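To give a flavor of the extensibility Anders describes, here is a minimal sketch of a new component written in Delphi itself. The component, unit and behavior are invented for illustration (they are not from the interview); it simply inherits from the built-in TButton and extends its Click behavior:

```pascal
unit BeepBtn;

interface

uses
  WinProcs, Classes, Controls, StdCtrls;  { WinProcs is the 16-bit Delphi 1 API unit }

type
  { Hypothetical component: a TButton that beeps when clicked }
  TBeepButton = class(TButton)
  public
    procedure Click; override;
  end;

procedure Register;

implementation

procedure TBeepButton.Click;
begin
  MessageBeep(0);    { add our own behavior...                  }
  inherited Click;   { ...then reuse TButton's standard handling }
end;

procedure Register;
begin
  { makes the component appear on the palette, next to the built-ins }
  RegisterComponents('Samples', [TBeepButton]);
end;

end.
```

The point of the sketch: the component author writes ordinary Object Pascal, inherits everything TButton already does, and overrides only what changes.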

Another key aspect of Delphi is its versatility. Other tools tend to focus either on Windows application development or on Client/Server development, and one always trades off the other. Delphi is equally adept at both, as is evident from the kinds of applications our customers are building. They range from shrink-wrap Windows utilities and multi-media games, through desktop database applications, and all the way up to multi-user enterprise-wide Client/Server solutions. The point is that almost any Windows application needs some form of database access, and any database application needs some form of Windows specific programming--to be productive, you need a tool that does both.

Delphi really leverages a lot of very mature database technology from Borland, including ReportSmith, the Borland Database Engine, SQL Link native drivers for remote servers, and the Local InterBase Server. The InterBase server alone is a tremendous technology that gives developers the ability to use full ANSI-92 SQL in their applications, so they can begin exploring SQL and client/server development all on their local PC.

Q. You emphasise Delphi's versatility as an advantage, but surely it is also a drawback? If one needs to build a client/server application, PowerBuilder offers better CASE/database management facilities than Delphi.

A. There's an inherent advantage to being versatile. Look at the computer on my desktop. Do I need a dedicated word processor, a PC for my spreadsheets and a terminal with access to my customer records? No, I've got one PC that's versatile enough to do all these things.

A very large American retail chain--one of the largest--just standardized on Delphi over PowerBuilder precisely because their engineers can do 85% of all their work using Delphi versus 60% of their work using PowerBuilder. That saves them enormous amounts of money and complexity, including in ways you may not have considered. As an example, skills and techniques learned writing a small utility are directly applicable to client/server projects. A lot of today's programmers started out by writing those little command-line utilities in the good old days. It's a great way to experiment with and master the use of data structures, object-oriented techniques or learning about the Windows API. Consider, also, how using the same tool for a broad range of applications provides a company with a neat training path: Someone can start writing non-database programs and then gradually move onto projects dealing with valuable corporate data.

There's no end to the components and views you can add to Delphi. The population of programmers who can build components in Delphi is much larger than with any other tool on the market. We're back to the days when one programmer in one room can build and test something that can be used by tens of thousands of other people. Can you imagine what the availability of specialized component sets will be like in six months? In a year?

I think the entire point of combining a component-based visual development environment with an object-oriented compiler and database technology is to make sure you never run out of gas. That's not a bug--it's a feature.

Q. If you want a quick and dirty hack, surely it makes sense to use Visual Basic, because everybody can use it without having to master a scary, complex language like Pascal. If you are doing multi-media or real time work, why mess around with a system which delivers slightly slower performance, and requires you to hand-translate all the header files for any DLLs you may need, when you could just use C++?

A. As we like to say, "It's not your father's Turbo Pascal any more". We made sure that the Object Pascal code you have to write is as easy as BASIC but without limitations.

We've taken great pains to make sure that when you're interacting with components, the code you write is as simple as possible--but no simpler. Many reviewers have remarked that they thought they were coding in Basic when they first started using Delphi. It's that easy. When they want to do something more interesting and start using the richness of the language, they usually start remembering how much they like Pascal.

In fact, I think you miss an essential advantage of Delphi. Anybody who has used a compiler--especially one that supports good type-checking--knows that a compiler is really a programmer's best friend. When it tells you it's probably not a good idea to take the square root of your Window caption, it's showing you a logic error in your code and saving you time. Is it an advantage that BASIC will perform automatic type conversions in that circumstance instead of giving you an error? I wish my spell checker program could complain about the logic of a paragraph I've written in the same way as our compiler warns you about illogical programming statements. Our 32-bit compiler goes even further and offers you all sorts of hints about problems it detects in your program. This kind of help is invaluable and one of the things that makes programming in Delphi very productive.

Q. What is the secret of Delphi's fast compile/link cycle?

A. Borland has over ten years of experience in building the world's fastest compilers, and we've put that knowledge to good use in Delphi--it compiles at about 350,000 lines per minute on a 90 MHz Pentium. A number of factors contribute to this throughput. Delphi units (code modules) compile to .DCU files, which you can think of as a combination of a C++ precompiled header file and an .OBJ file. (It's funny how the hot topic in the C++ community is pre-compiled header files and incremental linking--Borland's Object Pascal technology has had these features for more than eight years.) Delphi units specify what other units they depend on through USES clauses--sort of like C++ #include's of header files. By analyzing the USES clauses of each unit in a project, the compiler can automatically perform minimal builds with no need for a make file. The net result is that the compiler never compiles more than it has to, and it never compiles the same thing more than once. Finally, the clean syntax of Object Pascal allows for very fast parsing.
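For readers who haven't seen Object Pascal units, here is a minimal sketch of the structure being described (unit and routine names are made up):

```pascal
unit MathUtil;

interface                { the public part: all that other units can see }

function DoubleIt(X: Integer): Integer;

implementation           { the private part; changes here don't force a
                           recompile of units that merely use MathUtil }

function DoubleIt(X: Integer): Integer;
begin
  DoubleIt := X * 2;
end;

end.
```

A client unit declares `uses MathUtil;`, and from that explicit dependency the compiler knows to rebuild MathUtil.DCU only when its source has changed--no make file required.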

Q. Is the compiler engine itself written in Delphi? How much does it differ from the Borland Pascal 7 compiler?

A. The compiler is written in assembly language. It is fully backwards compatible with BP7, and we've added lots of object-oriented extensions such as class references, virtual constructors, and the IS and AS operators. We did a lot of work to enable declaring, registering and filing properties and we generate run-time type information that's used to communicate published property, event and method information to the development environment. You'll see some interesting applications of that capability in our 32-bit release. One very unique enhancement was our use of bound method instance pointers to implement event delegation. They're very efficient and fit nicely into the language. And of course we did a lot of work to add structured exception handling. In addition, there are lots of little niceties that people have requested, such as support for C calling conventions.
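The "bound method instance pointers" Anders mentions surface in the language as procedure-of-object types. A simplified sketch of the idea follows; the declarations mimic the shape of VCL event handling but are invented for illustration, not copied from the actual source:

```pascal
type
  { a method pointer stores two things: a code address
    and the instance (Self) it is bound to }
  TNotifyEvent = procedure(Sender: TObject) of object;

  TMyButton = class
  private
    FOnClick: TNotifyEvent;
  public
    procedure Click;
    property OnClick: TNotifyEvent read FOnClick write FOnClick;
  end;

procedure TMyButton.Click;
begin
  if Assigned(FOnClick) then
    FOnClick(Self);   { delegation: invoke whatever handler was assigned }
end;
```

Assigning `Button1.OnClick := Form1.ButtonClicked;` binds both the method and the form instance in one pointer, which is why the component can call back into the right object with no glue code.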

Q. Delphi implements objects in a manner similar to Apple's Object Pascal, with all objects allocated on the heap. Previous versions of Turbo/Borland Pascal used a more C++-like approach, with the ability to allocate objects on the stack and statically. Can you explain the reasoning behind this change in approach?

A. It really is a question of features vs. complexity. The philosophy of Delphi's Object Pascal language is to deliver the RIGHT set of language features, as opposed to any language feature ever known to mankind. It's the well known 80/20 rule: You can get 80% of the power for 20% of the complexity, but squeezing out that last 20% of power makes the whole thing five times as complex to program. Mixing static and dynamic allocation of objects is one of those features that fall into the latter group. By implementing a pure reference model we were able to simplify the entire Delphi component library, and do away with a lot of the pointer management that plagues other products. Even though Delphi objects are allocated on the heap, in a typical Delphi application you never have to deal with allocating and freeing them.
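In day-to-day code, the pure reference model looks something like this minimal sketch (components owned by a form are freed by the form automatically, so the explicit Free shown here is mainly for standalone objects):

```pascal
var
  Names: TStringList;
begin
  Names := TStringList.Create;  { every object lives on the heap;     }
  try                           { the variable holds only a reference }
    Names.Add('Delphi');
    Names.Add('Anders');
  finally
    Names.Free;                 { release it when done, even if an
                                  exception occurred in between }
  end;
end;
```

Because there is only one allocation model, there is never a question of whether a given object is on the stack or the heap, which is the simplification Anders is arguing for.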

Q. I find this answer quite surprising and counter-intuitive. You had already implemented mixed static/dynamic allocation, and therefore presumably cracked the problems involved, so why go to the trouble to revert to the Apple Object Pascal approach which you had initially rejected? Is, say, a stack allocated object, with constructors and destructors automatically called as the thing moves in and out of scope, really more complex than a heap allocated object, where you must make special provision to kill the thing off at the end of its life? I would have thought that the fact that the component library *mostly* frees objects automatically but *sometimes* doesn't would tend to add to rather than reduce the application programmer's burden. Also, the change in model must confuse both existing BP programmers and also migrating C++ users.

A. Again, we didn't revert to anything because we really started with a clean slate. Our class reference model is sufficiently powerful and flexible, so having only one sort of class is actually an advantage. Once you give someone two ways to do the same thing, you have made your product less usable and you have to now help them understand when to use a statically allocated class versus a dynamically allocated one. We're quite happy with the choice we've made. It's simple to understand, efficient, and allows us to add garbage collection in some future release. And, of course, if you've got old code from BP7 that uses old style objects, you can still compile it from within Delphi.

Q. Exception-handling - what were the major influences on your design?

A. We looked at a number of languages and implementations, and were most influenced by C++ and Modula-3. Delphi is like C++ in that exceptions are classes, but more like Modula-3 in terms of the supporting language constructs.

Exceptions are a quiet revolution--they truly simplify the way you write code. For the most part you can write your code as if errors will never occur, instead of spending the bulk of your time trying to determine if an error occurred, and if so, how best to clean up and back out of what you were doing. Delphi's Visual Component Library was designed from the ground up with exception handling built in, and that is a large part of the reason why Delphi and applications written in Delphi are so fault tolerant. One of my favorite demos is a little two-liner that, on the click of a button, assigns NIL to a pointer, and then dereferences the pointer. Each time you click the button, Delphi reports that a General Protection Fault exception has occurred, but because of the built-in exception handling logic, the app keeps running instead of bringing itself down.

Q. I'd like to draw you out a bit to expand the answer above with a few specifics.

A. As in C++, an exception in Delphi is simply a class, which means you can take advantage of the inheritance mechanism to handle whole sets of exceptions easily. For example, Delphi declares the following classes which deal with floating-point exceptions:

type
 EMathError = class(Exception);
 EInvalidOp = class(EMathError);
 EZeroDivide = class(EMathError);
 EOverflow = class(EMathError);
 EUnderflow = class(EMathError);

As you can see, EMathError is the ancestor of the other exceptions. Here's an example of a TRY..EXCEPT statement that handles floating-point exceptions:

try
 PerformCalculations;
except
 on EZeroDivide do ...;
 on EMathError do ...;
end;

If the PerformCalculations procedure raises an EZeroDivide exception, it is handled by the first handler. If it raises any other EMathError exception, the second handler takes care of it. Since there is no ELSE clause, no other exceptions are handled--they are instead propagated to an enclosing exception handler.

Q. Delphi's ability to handle GP faults is indeed one of its neatest tricks. Was it difficult to implement?

A. It wasn't too bad, but it did take some nifty use of TOOLHELP.DLL, which implements the Windows low-level system tools interface. We basically register an interrupt callback function which maps processor faults into Delphi exceptions. The reason that it all works, though, is that VCL was engineered from the ground up to be exception aware. Because of that, when a GP fault occurs and is mapped into an exception, the operation that was in progress will automatically know how to back out and clean itself up.

Q. Can we expect any other major syntax additions/changes, for example Eiffel style assertions?

A. We're always evaluating new language features, and surely there will be some in the upcoming 32-bit version. I'd rather not get into specifics, but as a rule, we don't really think about language extensions in the abstract. Instead we look at the language as part of a bigger picture (class library, component model, visual environment) that must evolve as a whole to support new technologies and improve ease of use.

Q. Can you give Delphi programmers any guidance on how best to write applications that will be portable to the 32-bit version of Delphi? The new "Cardinal" data type has arrived almost completely unnoticed. Are there any other issues we should be aware of?

A. Delphi's Visual Component Library was designed with portability in mind. As long as you stay away from in-line assembler, 16-bit pointer arithmetic, and Windows 3.1 API functions which aren't supported in the Win32 API, your apps should port with little or no modification.

The Cardinal and Smallint types were introduced to facilitate portable code. Of the built-in types, Shortint, Smallint, Longint, Byte, and Word have identical representations in 16- and 32-bit code. The Integer and Cardinal types, on the other hand, represent the most efficient signed and unsigned integer types of the particular platform. In the 16-bit version they are 16-bit entities, and in the 32-bit version they are 32-bit entities. In general, you should use Integer and Cardinal whenever possible, and Shortint, Smallint, Longint, Byte, and Word only when the exact storage representation matters.
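As a sketch of that guidance (the record and names are invented for illustration):

```pascal
type
  { on-disk or on-the-wire layout: exact sizes matter,
    so use the fixed-size types }
  THeader = record
    Version: Word;     { 16 bits on both platforms }
    Count: Longint;    { 32 bits on both platforms }
  end;

var
  { arithmetic and loop counters: use Integer, the platform's
    natural size -- 16-bit under Windows 3.1, 32-bit under Win32 }
  I: Integer;
```

The same source then compiles to the most efficient code on each platform while keeping any persistent data layouts stable.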

Any 64K limitations found in the 16-bit version will disappear in the 32-bit version. For example, the 32-bit version allows you to declare arrays and allocate heap blocks of any size up to 4GB!

Q. What is the current state of the 32-bit version? Will it support 16-bit VBXs, like BC++? Delphi 16-bit code runs somewhat slower than C++ - are you doing anything about this for the 32-bit version?

A. Delphi was written to be portable--we've been working on the 16- and 32-bit versions in parallel since day one. The 32-bit version is in field test now, and it will ship shortly after the commercial release of Windows 95. Yes, there is a foundation of 32-bit VBX support technology available in-house, but our primary focus is OCX controls. That's what the competition is working on, and that's where we see the market going. With respect to better code generation, Delphi-32 generates the same high-quality code as Borland C++ 4.5--in fact, they use the same optimizing back-end code generator.

Q. Are there any plans for Borland produced or badged add-ons for Delphi, in addition to the Visual Solutions Pack?

A. We just released the RAD Pack for Delphi, which includes Turbo Debugger for Windows, Resource Workshop, the Resource Expert, Visual Component Library source code, the--much requested--Language Reference Manual, and Visual Solutions Pack 1.1. We did have some quality problems with the initial release of VSP, but those have been resolved, and we now have a Companion Products group to provide Borland-quality add-ons, such as Notes support for Delphi programmers and other often requested components.

Q. Delphi is a terrific tool for rapidly developing state of the art software, but a number of shareware authors have expressed a wish that executables could be made smaller. Is it technically feasible to create a DLL-based version of VCL? Surely this must be possible since COMPLIB is a DLL which is used by the Delphi design environment?

A. It's something we're looking at, and certainly some of the 16-bit complexities with respect to multiple DLL clients are gone in 32-bit land. At this point I can't really comment on specific solutions, other than to say that we're actively looking at ways to make our executables even smaller.

Q. A long-standing and major criticism of Borland Pascal is the proprietary nature of the object file format. It's appreciated that going to the OBJ file format would be a retrograde step, but why won't Borland at least document the file format? That way, developers can create their own tools such as disassemblers, C to Pascal linkers and so forth. Again, it's understood that the file format changes with each release of the compiler, but documenting the changes with each new version would enable other developers to create conversion tools even if Borland don't want to do this. At the moment, if you don't have the source code, all your units become useless each time the compiler is updated.

A. We're well aware of these issues, and the 32-bit version will address them in a number of ways. What I can tell you at this point is that the 32-bit compiler has an option to produce .OBJ files, which can be linked with .OBJ files produced by other compilers.

Q. A related issue: the move to Windows has diluted the importance of the OBJ issue, because you can now call DLLs. But the Delphi user has still to translate the (typically) C/C++ headers into Delphi import units, an exercise which is at best tedious and time consuming and, if you happen not to have had C++ experience, quite hard. It's the sort of job best left to a machine. Given that Borland has a lot of C++ parsing expertise lying around on the ground, have there ever been any plans to create such a tool?

A. Well, I'm not sure which C/C++ headers you're talking about. We've already translated all the Windows and OLE 2 API header files, and corresponding interface units are included with Delphi. But you're right, if you have a 3rd party DLL that was previously only interfaced to C/C++, somebody will have to do the translation. Usually, it's not that bad and I think you'll see an increasing number of vendors providing Delphi interface files for their DLLs. Also, I think you'll see more and more products take advantage of the OLE 2 ITypeLib and ITypeInfo interfaces, and we'll provide a tool that takes that information and produces a Delphi interface unit.

Q. The ability to create a single EXE for redistribution is very attractive, but somewhat spoilt by the need to include the BDE with database applications, even if they only want to access the odd DBF. Any plans to clean this up?

A. We're working with several third parties, including SAX Software, Eschelon Development, Sterling Software, Great Lakes Software, and Shoreline Software. They have, or will soon have, products to help you deploy your Delphi database applications. In addition, we're making a deployment kit available, via CompuServe and Internet.

Q. Also on data access: is it possible to modify/inherit from the data access controls to provide, for example, 'native' access to FoxPro/Clipper databases? If so, are any such products being developed by Borland or Third Parties?

A. I know of several Third Parties working on native access to FoxPro/Clipper as well as Btrieve. Some of them are in beta at this point. You can contact them for additional information on the DELPHI CompuServe forum or find out about them in the Delphi "Power Tools" catalog.

Just to clarify: Sax, Eschalon, Sterling, etc. make install/setup tools that work with Delphi and the Borland Database Engine. There are other companies creating components that provide direct access to Btrieve, FoxPro, and other database formats.

Q. What is Delphi's main niche in the developer tools market? Compared with, say, Visual Basic, PowerBuilder and C++?

A. Delphi is a general purpose Windows RAD (Rapid Application Development) tool. The point is that Delphi is NOT a niche tool. From the outset, we've designed Delphi to take you from prototype to production, whether you're targeting a Client/Server environment or just writing a Windows application. I hear our competitors say that you can use their tool for rapid prototyping, and then port your app to C++ for production. But you know, rapid application development isn't really rapid unless you can go from prototype to production, all using the same tool! I also hear how competing products will address performance issues by generating C or C++ source code. This idea of building the application with one tool, and then having it generate C or C++ source files that have to be run through another tool, is ludicrous. How are you expected to debug the final code? Requiring users to find bugs in machine-generated C++ code, and understand how that maps to the original 4GL code, just doesn't make sense. We've been shipping development environments with integrated compilers for 12 years--I think the time is gone when programmers would accept anything else.

Q. How have Borland's recent troubles affected Delphi's development and its take-up? Was the absence of a Language Reference Manual in the initial product a consequence of these troubles? (You had to mention the Language Reference Manual?  My mistake, I'm sorry! --Zack)

A. In the two years we were developing Delphi, the company did go through some difficult times. That was all resolved before we shipped. Now the entire company is focused on development tools, we've won the Lotus lawsuit, we've launched Delphi and Delphi Client/Server worldwide, and both products continue to sell well above expectations. In fact, I understand that Gray Matter reports that Delphi is the #1 selling development tool in the UK. The Delphi development team is 100% intact, and focused on Delphi for Windows 95.

We really underestimated the demand for the Language Reference Manual. (You're telling me! --Zack)  It will be included in Delphi for Windows 95. Meanwhile, we've made good by uploading an Adobe Acrobat version to CompuServe and our WWW page, and a printed version is now also available from Borland.

Q. We are all pleased you resisted the opportunity to christen a product Power Visual Turbo Pascal Objects for Windows - but how did it come to be called `Delphi'?

A. We actually tried to call it "Power Visual Turbo Pascal Objects for Windows", but that name was already trademarked :-). One of the senior guys in QA (Danny Thorpe) dreamed up Delphi as a code name quite early on, and every time we did a market survey of product name candidates, everyone said "well, those are ok, but we really like `Delphi'". So in the end, we kept it.

Q. Which part of Delphi are you most proud of? ... and which part least?

A. The thing I'm the least proud of is probably the initial lack of a Language Reference Manual. But that's taken care of now. (Dammit, I said I was sorry already! --Zack)

What I'm most proud of is the fact that the energy we invested in foundation technologies like extensibility and exception handling enabled us to build Delphi in itself. Can you imagine VB or PowerBuilder written in themselves? By building Delphi in Delphi, we really got to feel on our own bodies what was right about the product, and what needed fixing. I sometimes hear frustrated users comment "The programmers that wrote this %&##$ thing should be forced to use it themselves!". Well, we did, and we're really proud of the result.

-------- CHRONOLOGY --------

  • 1960 Anders born in Copenhagen, Denmark.
  • 1979 Enrolls at the Danish Engineering Academy. Co-founds PolyData, one of the first Danish microcomputer companies.
  • 1980 Releases his first Pascal compiler--a 12K Pascal subset in ROM for the British NASCOM Z-80 based kit computer. Eventually sells the rights to this product to Lucas Logic.
  • 1982 PolyPascal for CP/M-80 released. Product is now a complete implementation of the Pascal language.
  • 1983 Sells the Borland founders (Niels Alex Jensen, Ole Henriksen, Mogens Glad, and Philippe Kahn) on the idea of a Pascal compiler with an integrated editor. In November releases Turbo Pascal 1.0 for CP/M-80, CP/M-86, and MS-DOS. The newly formed Borland company, essentially penniless, places an advert for Turbo Pascal in Byte, bluffing the Byte Ad executives into giving them credit. The compiler is priced at $49.95 and is an instant smash hit.
  • 1986 Turbo Pascal 4.0 released, featuring an Integrated Development Environment--the first of its kind for the PC environment--and introducing modular compilation (previously Turbo Pascal programs had to be compiled all in one go and could be no larger than 64K unless overlays were used). CP/M support is dropped.
  • 1988 Turbo Pascal 5.0 released, featuring integrated debugging and VROOMM (Virtual Runtime Object Oriented Memory Manager) overlay management technology.
  • 1989 In response to Microsoft's object oriented QuickPascal, Borland releases Turbo Pascal 5.5, which has its own OOP extensions. Microsoft later drops QuickPascal from its product line.
  • 1990 Turbo Pascal 6.0 features a new, much improved Integrated Development Environment, and includes the Turbo Vision object-oriented application framework.
  • 1991 First release of Turbo Pascal for Windows. Features a Windows hosted IDE and the ObjectWindows Library (also known as OWL).
  • 1992 Borland Pascal 7.0 includes both a DOS and a Windows hosted IDE, and allows developers to target DOS, DOS Protected Mode, and Windows.
  • 1995 Delphi and Delphi Client/Server released on schedule on February 14th.

--------

This interview originally appeared in EXE, the UK's leading programming magazine. If someone can send me a .EXE cover or logo, it would be appreciated!
Thanks to Ben Riga for sending it to me. Thanks for also not mentioning the Language Reference Manual.


General Magic


I realize I'm late to the party on this, but I happened to catch the documentary "General Magic" recently and it's an amazing piece of work. It had been on my list of films to watch since early 2019, but somehow never bubbled up to the top. So it was fortunate that it was available on a Delta flight I had recently. For anyone who wants to understand the ups and downs of what it's like to be in a high-tech startup, this is an in-the-trenches documentary that captures startup reality better than anything else I've seen.

The film documents the rise (and fall) of Apple spinoff General Magic, a company that embarked upon the audacious goal of producing the first connected, handheld personal digital assistant (PDA), predating devices such as the HP 95LX, Psion Series 3 and Palm Pilot. General Magic was founded in 1989 by Marc Porat, Andy Hertzfeld and Bill Atkinson, the latter two among Apple's most famous engineers for their work on the original Macintosh. The company became a hotbed of innovation and secrecy. No one quite knew what they were up to, but given the pedigree, it was going to be big. The company attracted dozens of other prominent engineers and up-and-comers, including Susan Kare, who designed the UX; Tony Fadell, who went on to help create the iPod, the iPhone and Nest; Andy Rubin, who created Android; and Pierre Omidyar, who created eBay.

The documentary includes a wealth of historical footage. The team at General Magic knew they were working on something important, and they hired a crew to film all of it. So there are team meetings (bean bags on the floor), demos, late-night sessions, press conferences, nerf-gun fights and more. The historical footage is interspersed with contemporary interviews with key executives, press and analysts looking back on what they accomplished and why the company failed.

It's a heartbreaking story. Here's a company with vision, financial backing, genius engineers and a huge market opportunity, and it's obliterated to the point of obscurity. I doubt that today's startup engineers have even heard of General Magic, the Magic Cap operating system or any of the tools from that era. But the irony is that all of the technology became widespread within 20 years, including technology that later surfaced in the iPhone, Android, Twitter and Amazon. Ultimately, the people behind General Magic created multiple billion-dollar technologies and arguably entirely new industries. It was the hard lessons of General Magic's failure that ultimately resulted in the triumph of its many brilliant engineers.

Having worked in the software industry since the late '80s, this film captures the essence of Silicon Valley better than anything else I've seen. It's a powerful tribute to what it takes to succeed in the valley and asks the question: is it worth it? 

Here's the trailer:

 


Congrats to Duo Security!

Congratulations to Duo Security, which announced that it is to be acquired by Cisco Systems for $2.35b. This is a great outcome for all involved, and I'm very proud of what the team has accomplished.

I worked with Duo for about three years, initially as an advisor and ultimately as Chief Operating Officer running Sales, Marketing, Products, Engineering and Services. I helped grow the company from around $7m in Annual Recurring Revenue (ARR) to about $100m. The company has continued to grow to 700 employees, 12,000 customers and revenues that I estimate could exceed $200m ARR by year end, based on prior published numbers.

Duo is the fastest-growing company I've been a part of; faster even than Zendesk or MySQL. When a company grows this quickly, it becomes a different organization every year. The early sub-$10m revenue company is quite different from where Duo is today. I would always tell new employees to be prepared for change. What remained constant was an underlying set of midwestern values of hard work, customer care and innovation that made Duo special. (Also, we had a really fun band called "Louder Than Necessary.")

It's a testament to the founders' vision and the management skills of the leaders we recruited that the company scaled so well. I remain especially proud of the many people we hired, promoted and developed to become the future leaders in the company. As news of the acquisition came out, many people have asked me about the deal, so here are my thoughts...

First of all, this should be recognized as an absolute success for the management team. To grow a company to this size and value is very rare. Less than 1% of venture-backed companies get to a valuation of $1 billion. It's also one of the biggest software successes in the midwest and proves that you don't have to be in Silicon Valley to win big. (Duo has in fact expanded into San Mateo, Austin, London and Detroit. Part of Duo's success is due to these multiple locations, but that's another story.)

Secondly, this deal creates a larger force in the industry. There is no doubt that Duo could have proceeded towards an IPO in 2019; they had in fact hired several new executives in recent months to lead those efforts. But the combination of Cisco and Duo together is more significant than Duo on its own. There has been a large amount of consolidation in the security space in the last few years, and I believe Cisco will emerge as a leader. They have the respect and attention of Global 2000 CISOs and CIOs, a strong worldwide sales machine and a large number of related products. In a few years' time, it will be clear that Microsoft isn't the only company capable of reinvention.

Thirdly, this represents an opportunity for further growth for the team at Duo. Cisco plays on a larger global stage than Duo could on its own. But Duo's executives, managers, engineers, security experts, salespeople, support engineers, product team and marketing organization have a lot of mojo to contribute. Duo has become one of the fastest-growing SaaS companies on the planet, and they know a thing or two about making security easy. The company has a Net Promoter Score (NPS) of 68, one of the highest in the industry!
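For readers unfamiliar with the metric, NPS is just the percentage of promoters (9-10 ratings) minus the percentage of detractors (0-6 ratings). A minimal sketch; the survey breakdown below is a hypothetical illustration, not Duo's actual numbers:

```python
def nps(promoters: int, passives: int, detractors: int) -> int:
    """Net Promoter Score: % of promoters minus % of detractors."""
    total = promoters + passives + detractors
    return round(100 * (promoters - detractors) / total)

# Hypothetical survey of 100 respondents: 75 promoters, 18 passives, 7 detractors
print(nps(75, 18, 7))  # → 68
```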

And that is the key to why this deal makes sense to me. The combination gives Duo scale. But it also injects speed and innovation into an industry that needs it. The old approach to security--locking down your network with a VPN and using old-fogey security tokens--doesn't work when your applications are in the cloud, employees are mobile and hackers are targeting everyone and everything. I believe Duo is well positioned to lead with a modern new approach to security.

There's also a fourth point, which in the long term could become even more significant. Duo's success injects a large amount of capital into the Ann Arbor / Detroit area. The company has also developed tremendous expertise in building a SaaS company at scale. That combination of capital and talent will result in the creation of additional startups in coming years. Duo's investors (Benchmark, GV, Index, Redpoint, Renaissance, True Ventures...) did very well and are likely to be open to investing in new startups in the region alongside other firms focused on the midwest such as Drive Capital, eLab Ventures, Steve Case's Revolution, RPM Ventures and others. This acquisition shines a spotlight on Michigan's growing tech scene, and that will have all kinds of positive impact on investment, innovation and job creation.

To all my friends at Duo, this is a vote of confidence in all that you have created. Congratulations on achieving this milestone. Now there's an even bigger opportunity to take this product line, security expertise and company culture to a bigger audience than we ever thought possible.

Go Duo!  

 


Bumper Crop for IPOs in 2018

Zuora ipo

It looks like 2018 will be the strongest year for tech IPOs in recent history. Although some of the largest companies (Airbnb, Uber) are still waiting on the sidelines, so far we've seen a large number of very successful IPOs, including Dropbox, Zuora, Zscaler, Spotify and others. This week there were three strong IPOs--Ceridian, DocuSign and Smartsheet--which popped 30-42% in their debuts. These companies all have much stronger fundamentals than some of the troubled IPOs of 2017 (Blue Apron, Snap), which gave investors pause.

Pivotal is the only IPO that had a very modest rise on its debut, a day when the markets were down overall. But in the following week, Pivotal rose about 20%. DocuSign is a good example of a cloud company that has pursued a massive market opportunity. DocuSign has a $2 billion run rate ($518m revenue in Q1, doubling over the last two years) with a quarterly loss of $52m, compared to $115m a year ago. While the company certainly could have gone public earlier, at a $6 billion market cap, the wait seems to have been worth it.

One of the factors fueling demand for new IPOs is the strong Q1 results among public tech companies, including Amazon, Facebook, Microsoft and even Twitter. Microsoft's stock recently hit a historic high, based on the growth of its cloud business. Considering that Microsoft was a laggard in this space, it speaks not only to the disciplined management that CEO Satya Nadella has put in place, but also to the huge upside that still exists for tech companies with recurring-revenue SaaS offerings.

And that's precisely why we should expect to see even more IPOs in the second half of 2018. Acquia, Anaplan, Avast, Carbon Black and Domo all look likely to go out in the next few months. I would expect to see quite a few IPOs between now and mid-August, when bankers head out on vacation, and then more in the fall.

While there was a lot of volatility in the market earlier this year, this is still a nine-year bull market, and at least in the tech sector, it seems there is still a lot of headroom for growth.

What companies do you think will IPO in the rest of 2018? Will the bull market keep running? Let me know your thoughts by posting a comment below.

 


Are Tech IPOs Back in Fashion?


Much has been made of the slowdown in tech IPOs in recent years, but that trend appears to be changing in 2017. Of course, there are a few mega-companies that continue to sit on the sidelines (AirBnB, Uber, DropBox, I'm looking at you!) but I think we will continue to see improvements in 2017 and 2018. Perhaps not as strong as the record number of IPOs of 2014, but likely enough to reverse the declining trend from 2015 and 2016.

Early this year we saw IPOs from the likes of Snap, MuleSoft, Alteryx, Okta and Cloudera, among others. Other than Snap, which was rather over-hyped, most of the others had very good returns for their investors and are continuing to trade above their IPO price. And overall multiples for tech companies on NASDAQ and NYSE are holding steady. I'm especially encouraged by the performance of B2B software companies MuleSoft, Okta and Cloudera. MuleSoft now has a market cap over $3b, and Cloudera and Okta look likely to cross that threshold later this year based on their steady growth and increasing efficiency. It looks like B2B stocks are once again in fashion.

My expectation is we'll see a bit of an IPO slowdown during the summer and then a significant uptick in the fall. For B2B SaaS companies getting to $100m or beyond in annual recurring revenue (ARR), this will be an interesting time.


2016: Down Rounds & Layoffs

(Chart: NASDAQ performance of Box, Twitter, Etsy and Apigee)

If you thought 2015 was a rough year for the financial markets, you ain't seen nothin' yet. So far, 2016 has every sign of being a full-on bear market, meaning a 20% or more decline in the major stock markets. 

And no surprise, we've seen a ton of bad news for so-called Unicorns--tech companies valued at over a billion dollars. Unfortunately, many of these companies have failed to build the efficient, profitable businesses needed to maintain their lofty valuations. As a result, companies are seeing their public and private valuations dramatically reduced. Here are a few stories that have surfaced in recent months:

The point of all this bad news is that it's not just about one or two companies that have missed the mark. It's a systemic problem. And likely, we are still just seeing the early signs of what may become more common in 2016.

To be sure, there are plenty of strong companies out there whose valuations are well justified. Companies like Atlassian, Hubspot, New Relic and Zendesk all have efficient business models and disciplined growth.

But a lot of other wannabes have billion-dollar private valuations that are much harder to justify. It could be that the market correction in tech is just an adjustment to valuations that were never warranted in the first place. There are lots of cool apps, devices and services out there, but that doesn't mean they are good businesses. If a company gets to $100 million in revenues and has no clear path to profitability, it's kind of a fool's errand. And many of these companies haven't even gotten that far.

(WSJ stock charts, January 2016)

So what does this mean for startups? Basically the lofty multiples of 2010-2012 are gone. The truly great companies will still command good multiples, but only if they are converging towards profitability. The "growth at all cost" land grab strategy that inspired young companies to burn millions or tens of millions of dollars per month isn't going to be viable in the current climate.

If you're in a company with less than 12 months of runway, you'd better make sure management is reducing expenses. If you don't have an increasingly efficient growth story, this is going to be a tough time to raise money, so expect a down round. If you can weather the storm without having to raise additional capital and grow back into your valuation over time, that's not a bad way to operate.

 

 


Silicon Valley Overheated?


Don't Worry - Winter Is Coming

There's been quite a bit of press recently about whether there's a tech bubble or not. Certainly, things are overheated in the valley. Traffic is out of control, competition for talent is fierce and there are definitely some companies with billion dollar valuations that seem, well, a little suspect. If HBO's series "Silicon Valley" is meant to be a satire of the worst in tech, it's remarkably close to the truth in some areas. I think the show is funny, but almost every farcical element in that show seems way too close for comfort.

One of the things newcomers to the industry forget is that, like most sectors, tech is a cyclical industry. There are boom times when stocks seem to just go up and there's unlimited demand for IPOs, and then, well, there's the opposite. That's what happened in the early 1980s, more severely in 2001 and again in 2008. There have been a couple of minor corrections to sky-high SaaS valuations in 2014 and 2015, but nothing like we saw in earlier downturns.

The companies I work with all seem to be in good shape. They've got plenty of dry powder, having raised money in the last 12 months, and generally they are increasing their efficiency in terms of customer acquisition costs, cash position, etc. But for companies that have less than 12 months of runway, there could be problems. Like, going-out-of-business problems. As is occasionally reported by new CEOs who run out of money, there's this tricky thing called "burn rate." Which basically means you should make more money than you spend. Really, it's not that complicated.

When you build a company, you can't just optimize for growth at all costs. Otherwise, those costs can easily exceed what you're bringing in. And when the piper changes his tune and VCs and Wall Street decide to no longer value money-losing companies quite as optimistically as those that are generating cash, it can lead to some pretty ugly situations. Just take a look at the downfall of GetSatisfaction, Fab, Zirtual or even Good Technology, which managed an exit, but at less than half its billion-dollar valuation.

Good Technology was around for almost 20 years and raised $290 million (!) over six rounds. The company talked about doing an IPO back in 2013. They filed for an IPO in 2014 and then amended it in 2015. Sadly, their growth started to decline while losses continued to mount. They lost $95m on revenues of $158m in 2014, leaving them just 7 months of runway in 2015.
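To put those numbers in perspective, runway is just cash on hand divided by monthly net burn. A rough sketch; the $55m cash figure below is a hypothetical illustration chosen to match the roughly 7 months of runway described above, not a reported number:

```python
def runway_months(cash_on_hand: float, annual_net_burn: float) -> float:
    """Months until the cash runs out, assuming net losses approximate cash burn."""
    monthly_burn = annual_net_burn / 12
    return cash_on_hand / monthly_burn

# Hypothetical: ~$55m in the bank against a $95m annual burn
print(round(runway_months(55e6, 95e6)))  # → 7
```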

(Chart: Good Technology financials)

Presumably Good couldn't raise more money and couldn't do an IPO in the current market. So they got acquired by Blackberry, a struggling company if there ever was one. While some of the investors and execs will have made money, I doubt the rank-and-file employees got much out of it.

It was probably lucky for Good that they didn't complete their IPO. While strong SaaS performers like Hubspot, Zendesk and New Relic have done well, with steady growth in their revenues and share price, there's an increasing number of tech companies trading well below their IPO price, including Alibaba, Apigee, Box, Castlight Health, Etsy, Twitter and others.

Bill Gurley from Benchmark Capital has been a particularly strong voice reminding startups that "Winter is coming"; valuations are being compressed and CEOs need to make sure they have a path to profitability.  Words to live (or die) by.


Customer Service Revolution?

I posted a guest editorial over at CRM Buyer called "Managing the Customer Service Revolution."  Here's a brief excerpt.

More recently, the social network boom has created a new revolution in customer service. The reach and immediacy of Twitter, Facebook and, now, Google+ has made the voice of the customer an extremely powerful force. Bad customer experiences can quickly snowball into online customer uprisings leading to PR disasters...

As is often the case, tech-savvy startups are the first to embrace new technologies and communication channels. Larger, more traditional organizations are now finding that they need to develop new customer service strategies or else smaller, more nimble organizations may leave them in the dust and take their customers with them.

Interestingly enough, we see a lot of open source and SaaS startups embracing the new model of customer service. This includes rapidly growing companies like AirBnB, Box.net, Cloudera, CustomWare, DataStax, DropBox, GoodData, GroupOn, Hulu, New Relic, ScribD, Strobe, Twilio, Yammer, Zoosk, Zuora and thousands of others who are using Zendesk to deliver awesome customer service. I think the reason for this is that companies whose products and technologies are disruptive will often choose disruptive tools in other areas as well.

Check out the full story at CRM Buyer.