You know what they say about being schizophrenic – the upside is that you always have someone to talk to. Well, I’m going to enjoy my budding schizophrenia here and attempt to write a counterpoint to my own blarticle about Fusion, “I Pity the Fool”. If you liked the first one, you might enjoy this alternative take on Oracle's Fusion platform...
When I started my writing career at the tender age of 9, I thought my first major work, “The Crystal Mirror,” would be my opus. To my surprise, it was not my freshman novella but my “I Pity the Fool” blarticle that has received more comments, blog references, and trackbacks than anything else I have ever written. For those who have not read it, IPTF was a tongue-in-cheek recounting of Oracle’s January 18th “Halfway to Fusion” event. Using an homage to the 1980s A-Team television series, the blarticle focused on some of the very real technical challenges that the path to Fusion might present for Oracle customers.
Even today, as comments still trickle in, I am amazed at the breadth of opinions I have received. To date, no one has openly debated the A-Team analogy or the finer points I made with respect to the challenges of Oracle’s technology path. I suppose on both accounts that should be construed as a good thing. It has, however, been fascinating to hear how many Oracle customers see the path to Fusion as a less daunting – or even welcome – journey than I initially assumed it would be. Setting aside the few customers who are already loading up their Fusion sherpas and heading off to base camp 1 at the first sign of good weather, many other customers said their organizations remain unmoved in their commitment to Fusion regardless of the technical challenges (that’s unmoved in the “we’re still committed to Fusion” sense and not the “we’re not budging from PeopleSoft” sense). Regardless of each individual opinion, everyone who wrote seemed to think the “Halfway to Fusion” event was a significant milestone. For many, this was less a time to reflect on Oracle’s Fusion milestones and more a time to consider their own internal progress towards addressing the big pink elephant sitting quietly in the corner of their IT department: “What should our IT strategy be for the next decade?”
In an ode to Charles Phillips, before I advance more deeply into my Fusion counterpoint, let me first dispel a few myths about IPTF that have surfaced in the many postings and blog references to the blarticle.
IPTF Myth #1: Fusion is a Bad Idea
Saying something will be challenging does not inherently mean it is a bad idea. As proof, I once tried to eat 10 saltine crackers in 60 seconds on a dare. Now that was challenging (and in that case may have actually been a bad idea). But nowhere in IPTF, or any other blarticle I have written, have I said Fusion (or moving to it at some point in time) is definitely not worth its challenges. In fact, in a number of blarticles I have given the folks at Oracle credit for placing an extremely aggressive bet on the future of IT and for trying to emerge as one of the three most relevant players in the software industry (extra credit if you can name the other two). If you think I’m quoting revisionist history here, read (or re-read) my blarticle on the Emerging Application Operating System or my most recent blarticle on Oracle and SAP’s emerging battle with Microsoft over the hearts and minds of developers.
A lot of readers also thought that I was picking on Oracle, framing them as the big bad company that pulled the wool over everyone’s eyes. While the article was definitely tongue-in-cheek (when you live this enterprise software stuff every day, you have to find a way to lighten it up), I actually have pretty much the opposite opinion. It is rare that a seismic shift occurs in the software industry. Most of the tremors we feel turn out to do little more than rattle our fine china.
I promise to go into this idea in a lot more depth below, but let it be said that it’s not every company that survives such seismic events in their market. There is a litter of multibillion-dollar technology companies that for one paradigm shift or another completely lost their footing and were never able to recover. Take for example Wang, DEC, Data General, SGI, Novell (recently clawing back with a new open source strategy), Baan, and, arguably losing its grip on some markets now, Microsoft.
When you’re the Titanic, it’s pretty hard to turn away from an iceberg you don’t see coming. But both Oracle and SAP sounded the alarm long before most other companies even knew they were in icy waters. In fact, both SAP and Oracle not only saw the shift coming but have taken control of the reins (albeit some reins cost more than others) and are pretty much driving the shift themselves now.
IPTF Myth #2: SAP is definitely going to win the Application Operating System battle
If you read IPTF again, you’ll see that I attempted at many points to be balanced in my analysis of the challenges of Fusion and to point out that SAP is facing almost exactly the same issues. If you want further proof, look at the two application operating system stacks for both companies – they are almost exactly the same.
The subtle difference is that Oracle and SAP are fighting different battles in the hopes of winning the same war. SAP is battling change from within their walls. At the same time they are trying to convert themselves and their customers into nimble companies, they need to convince the world that they are really a core technology company as well. SAP is making some progress against this goal with the application stack of NetWeaver and Web Dynpro, their new development environment. There is some serious technology in there. Also consider their recent moves with SAP OnDemand. They announced a truly unique approach to on-demand services, allowing their customers to flip back and forth between a hosted model and a custom installed solution at a moment’s notice. That’s definitely an attempt to bark up the nimble tree.
Contrast this with Oracle. No one will argue that Oracle is not a technology leader. However, their own battle is how to stuff all their bags into one car after their recent $16 billion shopping spree. It has left them with a large and powerful customer base but a highly fragmented technology problem. The good thing is that Oracle has experience converting customers from one technology set to another (insert standard reference to Rdb here). Oracle’s battle only appears more challenging because it is more visible. In the end, what matters for comparison is that both companies face the same problem: convincing approximately 25,000 customers using old technology to walk towards the new technology light.
IPTF Myth #3: Oracle doesn’t understand the reality of their customers’ situations
If you want to understand how much Oracle understands about their customers, just listen to Thomas Kurian talk about Fusion Middleware. If we were playing technology kickball, I’d pick this guy first for my team. One of the most noticeable aspects of Oracle’s approach to Fusion is their intense effort to talk to customers. I know this because Newmerix also talks to their customers all the time. We go to RUGs (Regional User Group meetings) and see Oracle executives appearing on a regular basis. We know some of the folks at Quest International (the company that used to manage J.D. Edwards user groups and now manages many PeopleSoft RUGs and Oracle events). Oracle executives are standard fixtures at Quest events. And not just 3rd, 4th, or 5th level executives. You get the real deal: Wookey, Kurian, and Phillips at pretty much every one of them. In my 10+ years of being involved in IT, I can honestly say that I have never seen a company the size of Oracle expend so much energy talking to their customer base.
IPTF Myth #4: Oracle can’t pull Fusion off
Anyone who actually watched the A-Team knows that they always pulled it off. There are definitely moments in any episode when you think they may not make it out alive (okay, I was about 10 years old when I watched the A-Team, so I actually got nervous there for a second) – but they always did. That’s half the reason I picked the analogy for the blarticle: they find a way to pull it all together and prevail. Now, don’t get me wrong, prevailing does not mean breaking up Microsoft, putting SAP out of business, or building 15 more database-shaped buildings in Redwood Shores. It simply means they’ll get to where they want to be: one of the top 3 most relevant software companies in the world.
IPTF Myth #5: The press coverage on the event was poor
Okay – this is not a myth. I do think that is true.
Burning the Fusion Candle at Both Ends
As the comments roll in, I have spent a lot of time thinking about the other side of the Fusion coin. Here’s the basic question I keep asking myself, “Why would an Oracle customer want to move to Fusion?” After a while, it dawned on me that to answer this question you have to take both a micro and macro perspective and find the right problem that needs solving. Arguably IPTF is a micro perspective – delving deep into the leaves of a packaged application development team’s daily experiences. At that level, Fusion looks like an attempt to eat spaghetti with only a spoon. But to truly understand why a move to Fusion might not be a bad idea (or NetWeaver to attempt to be fair here again), you have to back way the heck up and look at the general state of IT. In some respects, while IT departments have made incredible progress over the last 8-10 years, they've also backed themselves into a bit of an IT cul-de-sac.
To properly explain what I mean by this, I’m going to come at it from two directions. Let me start top down by framing what I believe to be the problem with the general state of IT. While this sounds a bit like rationalizing a winter coat purchase through a global warming argument, it’s useful to understand some pretty major shifts that are occurring in global IT departments and how Fusion fits into that picture. After that, I’ll go bottom up and talk Fusion technology and how it might just alleviate some of the most annoying aspects of our daily IT lives.
Why Nicholas Carr is not a Prospect for Fusion
I recently read the now outdated, but still infamous, Nicholas Carr HBR curmudgeon-fest article, “IT Doesn’t Matter”. It should not be a surprise to find me joining the ranks of pretty much everyone else in Silicon Valley in saying that I think Mr. Carr basically missed the side of the barn with that one. However, the reason I think he missed the point is also the reason why I think people should be paying attention to Fusion.
Generally speaking – I believe that companies advance their competitive position in only one of three ways. Usually, in any given market, only one of these types of innovations occurs at any given time. In addition, the innovation cycle is repetitive and generally hits a market in relatively consistent waves. In the IT world, these waves lap against the budgetary rocks about every 7 to 10 years in an eerily consistent fashion. This creates a leapfrogging opportunity for competitive gain, where we see a small set of companies shoot out ahead of the others for a while, accumulate some market advantage, knock off a few of the weaker players, and then watch the general market approach an equilibrium again. Here are the three types of innovation I am talking about:
Physical innovation. This includes companies that gain competitive advantage through primary research (e.g. chemistry, material science, and in some cases simply better physical design). These would be the Intels, Apples, DuPonts, and Pfizers of the world.
Legal innovation. This would include companies that gain competitive advantage or positioning through legal acumen or exploitation of regulatory change. Good examples would be CLECs, satellite radio, EchoStar, SCO (if successful), or patent-trolling companies.
Process innovation. I firmly believe this case describes the competitive battlefront of most companies in the world. Companies focused on driving more efficiency into both their internal and external processes can (and do) win a competitive advantage against their competitors. And almost always, IT is at the heart of this discussion.
Setting aside physical innovation and legal innovation for the time being (as we’re talking about Fusion and IT here), I believe there are two ways that companies can gain competitive advantage by driving process efficiency. First, they can reduce their internal cost to deliver products and services to the consumer by increasing the efficiency of their internal delivery processes. This allows them to either gain competitive advantage through lower prices, or gain advantage by offering new features while maintaining the same competitive pricing. Second, companies can win customers by reducing the inefficiencies that they experience when buying or using products and services in their market.
Let’s take the travel industry as an example. Do you remember how much of a pain it was 10 years ago to call a travel agent – or god forbid call every airline – to try and find the cheapest fare from Pensacola to Duluth? There were huge inefficiencies for the consumer as well as massive inefficiencies for the airlines (reservations systems, manned airport counters, lost baggage management, customer service). By leveraging technology, Expedia and Travelocity completely decimated the travel agent industry in about 5 years. They gained a massive competitive advantage over the individual airline web sites, and it took a number of years for Orbitz to emerge and the industry to regain a relatively stable equilibrium. Referring back to Carr’s article, even some of the companies that Carr lauds as having the lowest “percentage of revenue spent on IT” versus “returns” were successful primarily because of their commitment to process efficiency. Dell (one of his examples) is nothing if not the poster child for gaining competitive advantage over the likes of Gateway and Compaq by vastly changing the process of buying a custom-configured PC as an end user.
My point is that Carr looks at IT as hardware, storage, and networks (e.g. utility computing infrastructure) and not as process-focused applications. Applications are the real gem of IT, and competitive advantage is won by those that can roll out applications that reduce internal and external process inefficiency as quickly as possible. I’d argue that the last major wave of IT innovation that allowed early adopters to increase the speed with which they rolled out process-efficiency applications was the advent of packaged applications. In looking at the promise of Fusion, I’d say that we’re just starting to see the next major wave swelling. Companies that reconfigure their IT departments to take advantage of this wave will have a chance to become breakout leaders for the next few years and gain significant competitive advantage. From a 60,000-mile view, this is the real promise of Fusion and the emerging Application Operating System.
A Brief History of IT Waves
Consider the last 20 years of application development in most global companies. In the days of independent PCs, some IT shops leveraged point applications to increase process efficiency. Spreadsheets, desktop publishing, POS applications – they all sped up simple business processes. While this revolutionized many processes, the problem with this approach was that it was isolated. In most cases, important business processes span departments and people. In these situations, it is the data, not the application (as different departments will access data through different types of applications) that needs to be shared across the process. With desktop point applications, data sharing was left up to the sneaker net and cross-departmental processes were left to run on paper and carbon copies.
The next technology wave to approach companies was the client/server paradigm. In one fell swoop, client/server completely changed the way that companies could use technology to automate their business processes across their organizations. At the same time, the modern IT department was born. For the first time, IT departments could focus on building applications that shared data throughout a business process, across different applications, systems, and eventually users.
The third wave was clearly the move to Internet-based computing. As companies abandoned proprietary networking protocols in favor of standards-based approaches, they found themselves able to offer applications directly to their customers and prospects. A whole new era of process-efficiency innovation swept the Global 2000. Those who integrated web-based customer applications directly into their core IT infrastructure took an early lead at the end of the ’90s and in the early years of this millennium. Internet-based self-service applications dramatically reduced the cost of call centers, counter personnel, and customer service, while improving customer and prospect buying experiences. As time has shown us, there was a litter of breakout successes and painful failures related to the last seven (or so) years of breakneck innovation.
But there has been a mounting problem growing behind the scenes. All this time, through the desktop, client/server, and eventually internet computing waves, companies’ IT departments were growing bigger and bigger. Large companies whose products had nothing to do with writing software found themselves making significant investments in their own internal software development teams. The expanding reliance on business applications for process efficiency meant a constant increase in IT department budgets to support these applications, build new features, and integrate with partners. The future was looking cautiously expensive and was raising the question: how much should software development be a core competency of my organization?
When packaged applications came along, they brought with them a new promise – the automation of all your internal and external business processes without any of the hassles of writing and maintaining your own code. The adoption of packaged applications promised to reduce the staggering dollars that IT departments were pumping into software development, while allowing companies to continue innovating efficiency into their business processes. It was not until the Y2K crisis that organizations were really faced with the downside of having written their own software in-house. For many, abandoning their existing custom application code base and starting afresh on a packaged application was a cost-effective alternative (short and long term) to ripping apart their COBOL code and looking for all the date fields.
While unbelievable progress has been made using packaged applications as the basis for application expansion, it is becoming evident that IT department costs are still rising and packaged applications have not been the silver bullet everyone hoped they would be. One of the main problems is that companies have continued to develop some business applications themselves while they have built others on packaged application frameworks. Many companies even have multiple packaged applications (the classic case is having one vendor for ERP and another for CRM). So IT departments are left with a landscape that is still costly to maintain, has applications running on different infrastructures, and smacks of a distinct lack of nimbleness in their ability to deliver the next wave of efficiency increasing applications.
And the truth is that businesses are not staying stagnant no matter how their applications are built. Business processes change, partners change, regulations have been introduced that specify how some process must be followed, and most industries are going through some form of consolidation or break-up. Every small change to the business means a change to the applications. Clearly a better solution is needed in order to get ahead or even just keep up.
SOA Matters, But Not for the Reason Most People Think
Kevin Sauer, one of our developers at Newmerix, sent me a link to a great interview between Channel 9 and Bill Gates. While most of the interview was standard “we still have roads yet to be taken” stuff, Bill articulated a very good example of our current problems with IT. Bill’s example is simple and powerful, and taken from the banking industry. He asks, “How many lines of code do you think are different between two banks’ basic processes?” His implication is that there is a huge number of different lines of code – probably the same as the number of Google millionaires created in the last couple of years. But Bill goes on to ask a pointed follow-up, “At a high level, how many real differences are there between the processes of authorizing a loan at one bank or another?” Well – not very many – probably the same as the number of Microsoft millionaires generated in the last couple of years.
His point is that when you slice and dice business processes into small activities, you start to see some pretty dramatic similarities across the way different businesses get things done. As I mentioned in IPTF, it is unrealistic to believe everyone will agree how to run their business processes the same way and hence each company’s processes need customization. But imagine for a second that when we talk about customizations, we talk about them at the process level and not the code level. All of a sudden, the customization discussion starts to sound bounded and manageable.
This is where the benefits of a process-centric approach to IT start to make sense. At the heart of it, Fusion is really just Oracle’s push towards a process-centric application creation and management platform. The general idea of Fusion is the following. Install Fusion and you get a basic framework for managing business processes. Oracle will get you started by giving you the most important building blocks of the standard processes most companies run. When you find that one of their vanilla process components doesn’t fit your business, you can choose to customize. But in a process-centric framework, you only need to customize the parts of the process that your business performs differently. Using the composite application approach of Fusion (a generally modular approach to architecting business applications through an SOA foundation), you can segment out your customizations from the rest of the application. In fact, because of the modularity of composite applications, you might even be able to buy a third-party component that gets you the customizations you require. If you need credit-checking facilities for a certain vendor, no problem – there may be a composite application that snaps right into your Fusion process to do just this. Sew it all together with BPEL (Business Process Execution Language) as a process execution manager and you’ve got a much simpler approach to swapping out the credit-checking facility for a third-party application or one that you have built yourself.
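The swap-ability described above is easy to illustrate outside of any particular platform. The sketch below is plain Python, and every class and function name in it is hypothetical (this is not a real Fusion or BPEL API); it simply shows how isolating the credit-checking step behind an interface lets you replace the vanilla component with a third-party or home-built one without touching the rest of the process:

```python
# Minimal sketch of composite-application component swapping.
# All names here are hypothetical illustrations, not a real Fusion/BPEL API.

class CreditChecker:
    """Abstract process step: any implementation can be plugged in."""
    def check(self, vendor_id: str, amount: float) -> bool:
        raise NotImplementedError

class VanillaCreditChecker(CreditChecker):
    """The 'vanilla' component shipped with the platform."""
    def check(self, vendor_id, amount):
        return amount < 10_000  # simplistic stand-in logic

class ThirdPartyCreditChecker(CreditChecker):
    """A bought (or home-built) replacement with different rules."""
    def check(self, vendor_id, amount):
        return vendor_id.startswith("TRUSTED") or amount < 50_000

def run_purchase_process(checker: CreditChecker, vendor_id: str, amount: float) -> str:
    # The surrounding process never changes; only the plugged-in step does.
    if not checker.check(vendor_id, amount):
        return "rejected"
    return "approved"

# Swapping the component requires no change to run_purchase_process:
print(run_purchase_process(VanillaCreditChecker(), "ACME", 25_000))    # rejected
print(run_purchase_process(ThirdPartyCreditChecker(), "ACME", 25_000)) # approved
```

In a real BPEL orchestration the same idea plays out declaratively: the process definition invokes a service endpoint, and rebinding that endpoint swaps the implementation.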
The Problem with Knowing Who to Call
When Ed and I were writing the original business plan for Newmerix, one of the great things about focusing on the packaged application market was that we knew who we’d be selling to. If you ever stumble across a good business idea where you can absolutely identify the person who can recommend, propose, and budget a purchase of your software, stop whatever you are doing and start the company. Having an unclear buyer and needing to dig around inside a company (or IT department) to find the right person is one of the few consistent things that failed startups can point to at the heart of their demise. Let me give you a contrasting example.
An alternative business plan I had on the back burner when we were considering starting Newmerix was to build a J2EE management company. At that time (2002), J2EE was just coming into its maturity phase in the enterprise development world. Hordes of developers were moving over to BEA, WebSphere, iPlanet, etc., and starting to develop their applications on these platforms. And at that time, there were not a lot of tools to help manage this development. Since then, many companies have formed to help out with the problem. Sitraka got bought by Quest, Parasoft is a pretty big force, MERQ was moving into J2EE testing and performance management, and the Wilys (now CA) of the world were starting to get some traction. The thing about the J2EE market that turned me off was that it was very hard to identify who you actually should call to pitch your product. Do I call the Director of IT? The Application Manager for a project? The developers themselves? While Rational showed us you can build a very good business selling pretty much directly to developers, it’s hard to turn these sales into IT-wide purchases. In contrast, Newmerix’s job is easy – call the Director of PeopleSoft. They live the problems of managing change in PeopleSoft on a daily basis, and 99% of the time they have a technical background so they can understand our proposed solutions. And many times, gasp!, they even control the budgeting process for ongoing maintenance and tools.
In relation to Fusion, the other side of this example is that it tells us how most organizations manage their packaged applications – as independent silos from the rest of development. A Director of PeopleSoft usually manages a completely self-contained PeopleSoft team, including developers, DBAs, testers, maybe a business analyst or two, and a migration expert. It’s not until the final code gets rolled into the production system that most PeopleSoft teams hand anything off to the general IT department (most companies appear to let general IT handle production systems). Great for Newmerix, bad for operational costs in your IT department. The reason for this is simple – PeopleSoft, J.D. Edwards, Siebel, or whatever are all highly specialized applications. You can’t take a custom application developer and have them jimmy something up in PeopleSoft. And you generally can’t ask a custom application tester to test PeopleSoft. They don’t know the first thing about the business processes PeopleSoft is designed for and thus have no idea what the right test cases to run might be. Hence the naturally insular approach to building and managing a PeopleSoft team.
To make matters worse, because PeopleSoft is a specialized arena, the cost of hiring full time employees, augmenting staff with consultants, or even outsourcing completely to a SI has a premium associated with it. A good, but not stellar PeopleSoft developer/migrator can cost you upwards of $100,000-$120,000 a year easy (depending on location). SAP consultants ring in at 150 bucks an hour. Compare that to a Java developer, who might cost you significantly less. Java (and C# and whatnot) are more widely available skill sets, so the general market price of these team members is lower. For all those Java developers ordering PeopleCode and ABAP books off of Amazon as you read this, keep in mind that the market for PeopleSoft and SAP developers is much smaller than the market for general application developers (so the grass is not as green as it may look).
Now take this cost problem and multiply it in two dimensions: suites and ISVs. Many companies actually have different groups to run each suite of PeopleSoft they buy (a rough equivalent might be J.D. Edwards customers running different groups for E1 and World). So your IT department has this specialization problem replicated once or twice or maybe more. Now consider what other packaged applications you might be running – Siebel, SAP, etc. – and you start to realize you have very high-cost silos of knowledge in your organization that run completely independently of all other development activities.
The composite application concept in Fusion fundamentally aims to lower the cost of developing applications by getting everyone back onto the same infrastructure page. All those Java developers you have now might be able to add some value to your Fusion team. All those Java management, testing, and profiling tools you bought will all of a sudden be usable when dealing with your ERP and CRM apps. While there is a cost associated with retooling a specialized workforce to do anything (yes, I picked on this very issue in IPTF), the long-term cost benefits can be dramatic. Go even further now and include the monitoring tools, performance specialists, and middleware specialists you already have in your organization. Rather than have a specialized team keep nudging each application along, you can have a large number of J2EE application stack specialists who can make all your applications hum. If it happens to be Fusion, then so be it.
Adding the Buy into Build or Buy
In the current packaged application world, you have two choices when you want new features – wait for PeopleSoft to deliver something or build (customize) it yourself. Having developed many custom applications over the last 10 years, I have learned that the first thing you do when scoping a project is try to find third-party widgets to take care of non-core functionality. If you can buy these widgets royalty-free and slot them into your application, you can dramatically reduce the amount of time your developers spend on functions unrelated to real customer value. Consider again the packaged application world. How many vendors out there develop modules for PeopleSoft? How many for J.D. Edwards? There are perhaps a few (Vertex rings a bell), but in general you’re not going to find the widget you are looking for. As we move into a composite application world, the marketplace for developers to build composite application widgets on top of Fusion opens up dramatically. SAP’s early introduction of xApps shows the potential for 3rd-party providers to extend the platform’s value. Oracle can’t be far behind.
Contrast this to the conundrum that PeopleSoft customers face on a regular basis: “I need some new functionality. PeopleSoft says it is on their roadmap, but not for 18 months. I can only get it from them (and then maybe will need to customize it when it arrives anyway), or I can build it myself.” Well, depending on their urgency, a lot of customers will take the latter path and in the process dig themselves deeper into their customization hole. Inevitably, when PeopleSoft does deliver the feature as part of the core module, the customer will weigh the effort to maintain their customizations against adopting and customizing PeopleSoft’s feature. Many end up sticking with what they built themselves. Imagine now if customers had a multiple-choice answer: get it from Oracle, build it myself, or buy it from one of 6 vendors developing components for the Fusion ecosystem. This is incredibly good news when considered through the lens of speed to delivery. 9 times out of 10, I’d guess that most companies will pick something already developed, and because of Fusion’s open model, they will still be able to customize the last 5% they need rather than write 100% of it.
If you still doubt that this reality might come to fruition, take a look at some of the early success Salesforce.com is having with AppExchange. For all intents and purposes, these guys are forging a vision that will inevitably be replicated on the Fusion and NetWeaver platforms as well. In addition, Microsoft already has an amazing array of prebuilt business components that you can plug right into SharePoint (which I think will be their Fusion equivalent when Dynamics arrives). The moral of the story is that applications to increase process efficiency will get easier and cheaper to build and maintain with the Fusion platform.
One additional piece of credit I have to give to Oracle is for their choice to continue supporting BEA in the Fusion stack. While people may believe this is solely an attempt to solidify their position that Fusion is standards-based and pluggable, part of the real value of keeping BEA in the stable is that it starts the Fusion platform off with an immense amount of 3rd party components. In essence, by the time Fusion arrives in all its glory, there will be a full composite component ecosystem ready to be consumed by you, the user.
Eskimos have 30 Words for Snow, Nestle has 29 for Vanilla
In my preamble about IT, you might remember that I said there is one critical element to building applications that drive efficiency into business processes. That critical element is data. Clearly one of the biggest requirements of IT applications today is to have access to the same data across the business process. But there is another emerging requirement around access to core data. Businesses need information about where they are being inefficient. It’s part and parcel of understanding my “competitive advantage through process efficiency” theory. Businesses need to understand which processes are the least efficient so they can focus their resources to regain a competitive edge.
Well-informed businesses run better than those with little or no information. Business winners find a way to be proactive rather than reacting to their customers or the market in real time. The biggest IT challenge in helping businesses accomplish this goal is that the data is sitting in many different database silos. You have customer data in your CRM system, transaction data in your financial system and materials information in your ERP system. Today, it's nearly impossible to get a consolidated view of what is going on without pulling all of this information together and trying to normalize all the idiosyncrasies of how each system stores and names things. A well-known example of this problem is Nestle – who after years of acquisitions finally realized they were paying 29 different prices for vanilla because the data was lying around in so many different places.
One of the key promises of Fusion, and one of the ways that Oracle is recommending many customers get started on the Fusion path, is to consolidate data into a common repository. SAP actually has a very similar strategy. They have been pitching their MDM (master data management) philosophy hard ever since SAPPHIRE '05. By consolidating data across your applications into one common Fusion database, you can get a normalized view of what's going on across departments. These cross-departmental views let you understand which processes are the least efficient and essentially help you build a project plan for where to spend your time with Fusion.
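To make the vanilla story concrete, here is a toy sketch of the consolidation idea. Every system name, field, and price below is invented for illustration; real MDM involves far messier matching, but the shape of the problem is the same: the price variance is invisible until the records share one normalized view.

```python
# Toy sketch: consolidating item records from several application silos
# into one normalized view. All system names, fields, and prices here
# are made up for illustration.
from collections import defaultdict

# Each source system stores the same commodity under its own name.
crm_items = [{"item": "Vanilla Extract", "price": 11.50}]
erp_items = [{"item": "VANILLA-EXT-01", "price": 9.75}]
fin_items = [{"item": "vanilla extract (bulk)", "price": 13.20}]

def normalize(name):
    """Crude name normalization: collapse every variant onto one key."""
    return "vanilla" if "vanilla" in name.lower() else name.lower()

prices_by_item = defaultdict(set)
for source in (crm_items, erp_items, fin_items):
    for rec in source:
        prices_by_item[normalize(rec["item"])].add(rec["price"])

# Only with the consolidated view does the variance become visible.
print(len(prices_by_item["vanilla"]))  # → 3 distinct prices paid for vanilla
```

Nestle's 29 prices were just this picture at enterprise scale: the data existed all along, but no single system could produce the grouped view.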
Total Cost of Ownership
When SAP first launched their NetWeaver marketing campaign, they made TCO a centerpiece of their message. The idea was simple. You run your mission-critical apps on SAP and most of your other apps on traditional middleware. Why not just move them all onto one application platform and reap the benefits of consolidation? While the market has focused on other issues since the early days of NetWeaver, I am a little disappointed that this message has been lost in the clutter of competition and hype. At the end of the day, this is absolutely the most important value proposition behind getting to Fusion as quickly as possible. Let's go through the basic costs involved in managing a packaged application, and in managing packaged applications separately from the rest of development:
The Cost of Customizations
Ahh, my old saw: customizations. One of the areas where I pressed Oracle pretty hard was on the issue of moving existing customizations to the Fusion platform. While I am still not backing away from my assertion that it is pretty unrealistic to think customizations can be migrated in any kind of automated way, there is another side of the customization story to tell.
Customization is probably the most expensive aspect of ongoing maintenance of a packaged application. Every time a patch, bundle, service pack, family pack, package or support pack (depending on your vendor of choice) is put into your system, you have to go through a very laborious process of rationalizing the vendor's changes with your customizations. Because packaged application architectures today are relatively monolithic around any given suite (e.g. HRMS), you have to go through each customization that might be affected and decide whether to take Oracle's new code, leave your own, or merge the two together. This happens across business logic (PeopleCode, J.D. Edwards BFL, Siebel VB Script, SAP's ABAP) as well as most of the reporting infrastructure that is intimately tied into customizations.
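The triage step described above can be sketched as simple set arithmetic. The object names and the patch manifest below are entirely hypothetical; real tools work at the level of compare reports and code diffs, but the three buckets are the same ones every upgrade team ends up sorting into.

```python
# Hypothetical sketch of patch-rationalization triage. Object names
# and the patch manifest are invented for illustration only.

customized = {"PAY_CALC", "BEN_ENROLL", "GL_POST"}       # objects we have modified
patch_manifest = {"PAY_CALC", "TAX_TABLE", "GL_POST"}    # objects the vendor patch touches

conflicts = customized & patch_manifest      # take vendor's, keep ours, or hand-merge
clean_vendor = patch_manifest - customized   # safe to take vendor code as-is
untouched = customized - patch_manifest      # our customizations survive unchanged

print(sorted(conflicts))  # → ['GL_POST', 'PAY_CALC'] each needs a decision
```

The pain grows with the `conflicts` set: every object in it demands a human decision and a retest, which is exactly why heavily customized shops fall behind on bundles.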
For companies that have been slowly adding customizations over time, this is an incredibly laborious and expensive process. And it grows non-linearly in terms of the effect on your organization's agility. Many packaged application owners have painted themselves into a corner with customizations, and there are some dire consequences. First, customization-heavy organizations find they have to forgo valuable new features delivered by the vendor because the colossal effort of reintegrating undocumented customization code from 5 years ago is daunting. Second, the ability to quickly put in a needed patch (whether it be to core functionality or a security fix) is hampered. I have spoken at many PeopleSoft RUGs and always ask the audience one question: "Raise your hand if you're up to date on patches, bundles and service packs." Usually you hear crickets, and then maybe one or two hands slowly rise, their owners staring at their shoes, knowing that no one will really believe them.
To make things worse, customizations have a lifecycle of their own. Anything you develop yourself will eventually need to evolve due to user or business demands. Users will ask for new features, and if they need to happen in a customized area of the application, you'll end up having to extend these customizations, your hole becoming deeper along the way. The ability to get your users what they need slows down in direct proportion to how customized you are. For many, an upgrade to Fusion will be a long-needed refactoring of a deep legacy of customizations.
Let me use an example from Newmerix (in the custom world). For a while we have had a product called Automate!Program Manager. A!PM was built pretty much from the ground up using ASP.NET and C# as the technology framework. While we delivered the business-related features the users wanted, we found ourselves, as with all products, with a laundry list of customer requests. When we looked at the request list, we found that a disproportionate number of these requests had nothing to do with business features. Most had to do with basic platform features. We found ourselves delivering release after release of things like table customization and report formatting. While valuable to the usability of the product, none of these releases advanced the core functionality of the product. Finally we decided to launch a major effort to replatform A!PM and go through a complete refactoring cycle. We ended up picking SharePoint as the base platform (I have much to say about this decision and will write about this journey in an upcoming blarticle). The primary reason we decided to rebuild on a platform like SharePoint was because a vast majority of our A!PM requests related to usability were covered by the underlying SharePoint platform. All of a sudden we found ourselves planning future releases based around core functionality instead of usability features like horizontal scrolling capability in our grid control. We’re back on track because we essentially cleaned out our customized code base. Yes, of course, we have had to reimplement some of how we deliver core features because of the way SharePoint works, but the net-net value of this move has been incredible.
Moving to Fusion is going to give you a chance to do this with your customization code base. In addition, the SOA model of Fusion provides the customer a whole new way to avoid the pitfalls of customization in the future. By modularizing your code base through the Fusion framework, you can sandbox components that are customized from those that are delivered by Oracle. When the time comes to adopt new ISV features to replace your customization, only a small number of components need to be changed and retested. This is a huge strategic advantage that serves both the reality that customizations are a critical necessity of any packaged application platform and the vendor's desire to have you adopt new features in as timely a manner as possible. Oddly enough, this is already a major difference between PeopleSoft's and SAP's architectures. SAP lets the user create "user exits" that essentially override SAP-delivered code. This architecture has made it a lot simpler to organize and modularize code changes. While there is still an integration process whenever SAP delivers something new, it's much less painful than PeopleSoft's approach of putting customizations directly into the code that they deliver (of course keeping a vanilla copy on DMO).
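The sandboxing idea, in miniature: a registry where a customer-supplied component overrides the vendor default without ever touching vendor code, much like an SAP user exit. This is a conceptual sketch only; the component names, rates, and registry mechanism are invented and do not correspond to any real Fusion API.

```python
# Sketch of "sandboxed customization": a lookup that prefers a
# customer override over the vendor-delivered component. All names
# and tax rates are illustrative, not real Fusion or SAP interfaces.

vendor_components = {
    "tax_calc": lambda amount: round(amount * 0.07, 2),   # vendor default
}
customer_overrides = {
    "tax_calc": lambda amount: round(amount * 0.05, 2),   # our "user exit"
}

def resolve(name):
    """The customer override wins if present; otherwise vendor code runs."""
    return customer_overrides.get(name, vendor_components[name])

print(resolve("tax_calc")(100.0))  # → 5.0, the override fires
```

When the vendor ships a new `tax_calc`, only `vendor_components` changes; the override either stays, or is retired by deleting one entry, with no merge of vendor source required. That is the whole argument for keeping customizations out of delivered code.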
The Customization Tail Wags the Process Dog
Another drawback of customizations is that you end up taking suboptimal approaches to the way you do business, because customized applications are expensive to change. One of Oracle's better arguments against keeping customizations is made by Lenley Hensarling (Vice President, Application Product Strategy). His basic premise is this: sometimes it's easier to revamp your business process than to keep pushing a square customization peg into a round process hole.
Humans have two basic qualities: incredible adaptability and short memories. Many times when a new business process is needed, IT departments will try their best to force-fit a solution into their existing applications. Too often, technology or historical constraints block the best possible solution, and users end up with less-than-perfect (or efficient) processes.
After an initial period of griping, most humans forget the technology workarounds or 10 extra manual steps they have to perform and just get on with it, slowly losing sight of the fact that they are being grossly inefficient in how they get things done. In these cases, technology has taken over and is the tail wagging the process dog.
One of the benefits of moving to Fusion is that it will force a critical look at each of your business processes. You get to ask yourself, "Are we really being that efficient given the constraints of our technology and how we have customized things?" Many will find a direct correlation between these inefficient processes and the legacy customizations in their applications. The problem is that while customizations are usually a handy finger in a bursting dam, they sometimes move a process out of the arena where it can participate fully in the value of the original system delivered by the ISV. Redesigning a business process on Fusion can move you away from many of these customizations, most of which you will never notice are gone once you have a better process in place. In many ways, this is a refactoring at the process level as opposed to the customization level.
The Cost of Upgrades
One of the biggest nightmares of packaged application ownership is the process of patching and upgrading your applications. Primarily because of customizations, the whole upgrade process slows to a snail's pace as you figure out what stays and what goes. The Fusion platform's SOA promise, if you do some up-front planning, is to make this much simpler for you. Past the customization problem, part of the challenge is that current ISV architectures make patching or upgrading just one piece of their application very hard to do. You always run the risk of affecting a related part of the application and having to retest the whole thing. If you just want to get some patches and fixes to one module in a suite (say eBenefits in HRMS), you're stuck testing a lot of the application because you don't know what other parts of it are affected by the centralized metadata model which you might have changed. This makes incremental adoption of new features or patches a royal pain in the ass. As is always my sidenote here, we started Newmerix because of how complicated (and frequent) this process is. While I don't expect Fusion to completely remove this warning from the side of the packaged application box, the concept of being able to upgrade just one piece of the application (e.g. a SOA web service) is quite alluring.
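The retest-scoping problem above can be sketched as a reachability question over a dependency graph: patch one module, and everything that (transitively) depends on it is in play. The module names and edges below are invented; the point is that a modular, declared-dependency architecture makes this set small and computable, where a monolithic shared-metadata model makes it effectively "everything".

```python
# Sketch of patch-impact scoping over a declared dependency graph.
# Module names and edges are hypothetical, for illustration only.

deps = {  # module -> modules it depends on
    "eBenefits": ["HR_Core"],
    "Payroll": ["HR_Core", "GL"],
    "SelfService": ["eBenefits"],
}

def affected_by(patched):
    """Every module that transitively depends on the patched module."""
    hit = set()
    changed = True
    while changed:  # propagate until no new dependents are found
        changed = False
        for mod, uses in deps.items():
            if mod not in hit and (patched in uses or hit & set(uses)):
                hit.add(mod)
                changed = True
    return hit

print(sorted(affected_by("eBenefits")))  # → ['SelfService'] needs retest
print(sorted(affected_by("HR_Core")))    # → everything downstream of the core
```

Patching a leaf service touches almost nothing; patching the core still fans out, but at least you know exactly how far, which is precisely what today's centralized metadata models cannot tell you.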
The Cost of Not Having Standards
One of the biggest arguments in the TCO pitch is the move to open standards. Standards don’t necessarily mean free, but they mean interoperable. If you already have licenses for some piece of the application stack, plug that piece in and save yourself the cost of buying something new.
Today this is already happening in the portal world. Standards like JSR 168 and WSRP are allowing customers to retain their investment in BEA, Plumtree, Oracle Enterprise Portal, etc., and navigate them into the Fusion docks. In addition, all the core application server platforms are converging on the same underlying J2EE and web services standards.
BPEL is critical to the new SOA-based AOS architecture. It started out as a fabulous attempt to evolve workflow management from a document-centric activity to a process-centric activity. The first pass collapsed hordes of proprietary workflow standards that had been swirling around for years. Using existing web services standards as a base, the BPEL 1.0 spec was spit out by the OASIS working group. Only 2 years later, though, we have BPEL 1.0, BPEL 1.1, WS-BPEL and two totally different models of how BPEL processes are executed: orchestration and choreography. Already the core standards of the application operating system are eroding. It's my great desire that the folks at Oracle, SAP and Microsoft (who seem to already be going their own BPM path with Windows Workflow Foundation) try their hardest to maintain at least some basic standard of support. Without it, customers will slowly be forced away from choice and back to a single-vendor application stack. While I'm not ringing the death knell yet, I am concerned about the direction things are heading.
Agnosticism in Limited Bites
With those TCO arguments in hand, there is one last note I need to make about the path to Fusion. One of my speculations in IPTF was whether or not Oracle would support non-Oracle databases on the Fusion platform. This is an important question, as across Oracle's four packaged application platforms there is a ton of DB2, some SQL Server, and a smattering of Sybase and Informix still in the customer base. In talking to some of the folks at Quest International Users Group the other day, they informed me that Oracle had essentially committed to maintaining DB2 as a viable platform choice in Fusion. "It had to be official," they joked with me, "they gave us all pens." This should be heralded as encouraging news for a lot of the customer base who have standardized on DB2. And in more recent news, I can only imagine that Oracle's purchase of Sleepycat may well be the third leg of their database stool. SAP is showing increased support for open source databases as part of the NetWeaver platform through their support of MySQL. While I am dubious that Oracle is willing to give up a chance at database license revenue underneath Fusion, at least it appears there will be an open-source alternative if customers demand it.
Back to the Beginning
No one, Oracle included, has said getting to Fusion will be easy. Neither is planning the next decade of your IT department. In reality, both will be many-stepped processes over a number of years. Going back to the initial goal of speeding up application delivery so you can continue to gain competitive advantage with efficient business processes, customers really only have two places to start: normalize data or normalize development. When Oracle says "get to Fusion middleware" and "get your developers onto Fusion tools", what they are really saying is start to standardize how you do things in IT and prepare for the coming shift. Imagine the efficiencies you can gain with one consolidated development platform for both custom and packaged application development. This is the vision of composite applications, SOA, and inevitably Fusion or NetWeaver. And it will have real, measurable, short-term ROI.
But after that, how you proceed is still a subject of much debate in the Oracle community. Some customers will simply wait until all their existing PeopleSoft applications are available on Fusion. Others will adopt the SOA mantra and integrate PeopleSoft, Fusion applications, and custom extensions one piece at a time. In any case, Fusion does give you a roadmap to incrementally (through the magic of SOA, BPEL, etc.) get from here to there. While there are some pretty nasty speed bumps on this incremental path due to customizations, reporting, and skill-set conversions, the destination may end up being worth the effort of the journey.