Well, this is certainly going to be "duh"-obvious, but it's clear that we are entering the year of The Cloud. The rapid-fire succession of announcements from major players of one or another form of cloud offering is not just the result of herd mentality or early-in-the-year boredom: it's a symptom of the relevance The Cloud has reached.
Why now?
- A first reason is the current crisis. On each and every project, controlling cost has become the core preoccupation. And I really do mean control as in exerting control over cost - predicting it, managing it - not just containing it.
- A second reason is the availability of manageable technical solutions to support cloud-based platforms and applications. We are beyond cloud-supported data or document storage; we are now in the days of cloud-supported services and netbooks.
- And a third reason is the emergence of cloud providers with enough clout to let significant early adopters take the rational bet of overcoming the remaining complexities and confusions and start leveraging the offerings: traditional platform vendors, major Web commerce players, innovative newcomers.
The confluence of these three trends is - in my neophyte opinion - the key to the inevitable success of The Cloud in 2009.
How will that happen?
- The economic drivers make The Cloud more relevant. Projects will focus on the management of cost, and The Cloud's inherent support for controllable scalability will make it the most attractive platform to work on.
- The innovative development community - including software vendors as well as system integrators - will flock to The Cloud, will support the development of the corresponding tooling (IDEs, platforms/middleware, ...) and will ensure the success of the various cloud-based or cloud-supported platforms (PaaS) on which service-based (SaaS) applications will be created and/or composed.
- The support for dedicated governance and management provided by cloud-based or cloud-supported platforms will prove very attractive to businesses that have been burnt by previous attempts to build major on-premise SOA applications.
- Governance, life cycle management, etc., will become sources of differentiation between cloud offerings, and that will lead to significant innovation and momentum in that key domain.
- Ubiquity and better governance and management will enable more customer/collaboration/social-centric (rather than process-centric) applications, shifting the focus from typical B2B/B2C to Business-to-Community (B2Comm?).
- Which will create further need for differentiation: better-quality services, easier to assemble into high-value, evolving, adaptive applications that take their various communities (users, analysts, customers, etc.) as part of their essence.
- Development will be permanent, deployment frequent, adaptation constant. New cultural points of view will be incorporated into the applications.
- Etc.
All this, all of it, enabled by the fact that The Cloud makes application cost manageable. Cost becomes something that can be throttled. Just think about it: how does the cost manageability of a major cloud-supported enterprise application compare with that of a major J2EE enterprise application (if you want to suffer, say WebSphere)?
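To make that comparison concrete, here is a deliberately crude back-of-the-envelope sketch in Python. Every number in it - prices, fleet size, the demand curve - is an invented assumption for illustration, not a vendor figure; the point is only the shape of the two bills.

```python
# Back-of-the-envelope comparison: fixed peak-sized capacity vs.
# pay-per-use cloud capacity. Every figure here is an invented,
# illustrative assumption - not a vendor price.

PEAK_SERVERS = 100          # on-premise fleet sized for the daily peak
OWNED_COST_PER_HOUR = 1.0   # assumed amortized hourly cost of one owned server
CLOUD_COST_PER_HOUR = 1.5   # assumed (premium) hourly price of one rented server

# Hypothetical hourly demand over one day: quiet nights, a midday peak.
demand = [10] * 8 + [60] * 4 + [100] * 4 + [40] * 8   # 24 hourly samples

fixed_cost = PEAK_SERVERS * OWNED_COST_PER_HOUR * len(demand)
cloud_cost = sum(servers * CLOUD_COST_PER_HOUR for servers in demand)

print(f"fixed provisioning: ${fixed_cost:,.0f} per day")   # $2,400
print(f"pay-per-use cloud:  ${cloud_cost:,.0f} per day")   # $1,560

# Even at a 50% hourly premium, paying only for capacity actually in use
# beats keeping a peak-sized fleet running around the clock - and, more
# to the point, the cloud bill is a dial the project can turn.
```

The specific numbers don't matter; what matters is that the cloud bill tracks demand and can be throttled, while the fixed fleet is a sunk decision sized for the peak.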
What to be careful about?
Applications, in particular the "enterprise" applications I deal with, live and die through the quality of the decisions they support. An application can be very beautiful and execute very fast and in a secure way, but if it fails to generate the business value that is expected, its growth - or even its survival - is questionable.
Enterprise Decision Management (EDM), or simply Decision Management (DM), addresses that - in the pre-cloud days.
The challenge is now to think about how to approach EDM / DM for The Cloud.
Interesting times.
Saturday, January 3, 2009
A little more on risk mismanagement
The NYT just published an interesting analysis of the role played by modern financial models in the current meltdown [http://www.nytimes.com/2009/01/04/magazine/04risk-t.html?_r=1&partner=permalink&exprod=permalink&pagewanted=all]. The article highlights the particular role played by VaR models and the institutionalized - and improper - reliance on models of that kind.
I have blogged about this in the past. It's common knowledge - although also commonly ignored - that mathematical models are just that: models. They operate under sets of assumptions that need to be understood before the models are applied.
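To make the point about assumptions concrete, here is a small Monte Carlo sketch (all parameters invented for illustration): two return series with identical variance, one normally distributed and one fat-tailed, look equally risky to a variance-based model, yet behave very differently in the tail - which is precisely where VaR is supposed to protect you.

```python
# Minimal sketch of how a VaR model's distributional assumption drives
# its answer. Purely illustrative parameters; not a production model.
import numpy as np

rng = np.random.default_rng(0)
N, CONFIDENCE = 100_000, 0.99

# Same mean and variance, two different worlds: normal returns vs.
# fat-tailed returns (Student's t with 3 degrees of freedom, rescaled
# to unit variance, since Var[t(3)] = 3).
normal_returns = rng.normal(0.0, 1.0, N)
t_returns = rng.standard_t(df=3, size=N) / np.sqrt(3.0)

for name, returns in [("normal", normal_returns), ("fat-tailed", t_returns)]:
    var99 = -np.quantile(returns, 1 - CONFIDENCE)   # 99% value-at-risk
    worst = -returns.min()                          # worst simulated loss
    print(f"{name:10s}  99% VaR: {var99:5.2f}   worst loss: {worst:6.2f}")

# Both series have the same variance, so a variance-based model treats
# them as equally risky - yet the fat-tailed world produces far larger
# extreme losses beyond the VaR threshold.
```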
But as pointed out earlier [http://architectguy.blogspot.com/2008/12/financial-instruments-systems-analysis.html], the key problem from a technical standpoint is that no real systems analysis has been attempted on the complex combination of financial instruments being put to work. No exception or error propagation analysis, no interface consistency analysis, etc. These are all terms foreign to the daily practice of these instruments. Quants are not systems analysts.
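For a taste of what even a minimal error-propagation pass could look like, here is a sketch of first-order (delta-method) propagation of input uncertainty through a composed valuation. The tranche_value function and every number in it are hypothetical, invented purely for illustration - this is one simple flavor of the analysis, not anyone's actual model.

```python
# Toy example of the kind of error-propagation analysis the post argues
# is missing. The tranche_value function and all numbers are invented
# for illustration; this is not a real pricing model.
import math

def tranche_value(default_rate, correlation):
    # Hypothetical senior-tranche valuation, deliberately made very
    # sensitive to the correlation input (clustered defaults hurt).
    expected_loss = default_rate * (1 + 10 * correlation ** 2)
    return max(0.0, 1.0 - expected_loss)

def propagated_error(f, x, dx, y, dy, eps=1e-6):
    # First-order (delta-method) propagation using numeric derivatives,
    # assuming independent input errors.
    dfdx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)
    dfdy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)
    return math.hypot(dfdx * dx, dfdy * dy)

# A 2% default rate known to +/-0.5%; a correlation of 0.3 known only
# to +/-0.15, a plausible spread for an unobservable input.
value = tranche_value(0.02, 0.3)
error = propagated_error(tranche_value, 0.02, 0.005, 0.3, 0.15)
print(f"tranche value: {value:.3f} +/- {error:.3f}")

# The uncertainty in one upstream input (the correlation) contributes
# the larger share of the downstream instrument's error budget -
# exactly the kind of finding a systems-analysis pass should surface
# before the instrument is put to work.
```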
To the industry's credit, it may well be that the sheer complexity such a systems analysis entails makes it practically unfeasible. If that turns out to be the case, regulation-based restrictions on these instruments in the largest financial markets may help rein in the complexity - at the cost of reduced creativity. The current meltdown makes the case that such a move would probably be positive in the short to medium term - and would avoid having taxpayers foot a huge bill to bail out an already highly rewarded and ultimately irresponsible industry.
If the financial industry wants to avoid over-regulation, it will have to prove it can control its risk, through a combination of better overall systems analysis and better understanding of the decisions made.
Decision management is gradually incorporating the relevant aspects of systems analysis, and will turn out to be an unavoidable piece of the core processes that use these complex financial instruments. You need to be prepared to deal with the unexpected, and to understand the impact of the decisions you make.