These posts are from the desk of our CEO: his thoughts on entrepreneurship, leadership, and business development.

TCO – Back to Basics

Total Cost of Ownership (TCO) has been around for decades as the standard yardstick for calculating the direct and indirect costs associated with the purchase of IT, but through time-worn iteration, it has morphed in a BIG way.


The dynamic change of TCO itself is in large part caused by the spiraling number of components that make up what defines cost itself. However, while that is all fine and good, I have observed usage of TCO in daily conversation that stands in stark contrast to the over-arching intention of the acronym.

In this day and age of shrinking budgets, higher accountability and mind-boggling competition, the call signs surrounding daily dialogue with and within enterprises resonate loud and clear, and ubiquitously at that:

“Can you cut my costs?”

“Can you save me money, time, resources…now?”

“Can you give me more X, Y, and Z for less?”

“Can you get it done faster, cheaper and with better support?”

In my business, my de facto answer is ‘yes’ to all of the aforementioned, not because I am programmed to say such or because everyone says it but rather because I mean it.

“I mean, I really mean it.”

So big deal…one may think…

  • Well, it is a big deal. It is a really big deal!
  • When I answer yes to a question about TCO, my answer could seem overly simple, and it is.
  • But the path to being able to give that answer in earnest is far from simple.
  • My path to ‘yes’ is multi-faceted to say the least because the question itself is hugely complicated.
  • TCO is complex; it can represent virtually limitless things to different companies, and I know it.

“I respect it.”

  • With that always in mind, I am wholly comfortable in answering “yes” although the answer may often and unfortunately appear trite on the surface.

Reality 1: I hear the letters T, C, and O, sequentially, more times than I can count.

Reality 2: I’ve come to realize that even the most seasoned professionals have lost bits and pieces of the meaning of each singular word that makes up the whole, TCO.

Reality 3: Along the way, the individual parts of TCO – Total and Cost and Ownership – have become diluted, or perhaps a better way to put it is that they have become segregated.

Result: Collectively, the acronym has become, for lack of a better word, cheap.

What the heck am I talking about? What’s my point? 

 TCO – Let’s Break it Down

“T”

  • Let’s remember that the acronym starts with T.
  • Was this an accident? I have no idea.
  • It could just as easily have been called OCT – Ownership Cost in Total, but it wasn’t and isn’t.
  • It starts with ‘T’; I take it at face value and am exactingly mindful of its meaning.
  • The word, Total, by definition is all-encompassing, complete and absolute.
  • Total is akin to a circle, in that it has no beginning and no end.
  • It identifies, considers, and accounts for everything, and only then does it satisfy the word – Total.
  • Total doesn’t parse out pieces nor does it examine parts in isolation without consideration of how the parts affect one another.

“Total is total – Period.”

  • As such, I am obsessed with making sure that I have considered all things that make up the word Total with regard to cost.

“C”

  • In my experience in the software business, when people speak about TCO, Cost is uber-weighted towards license cost, maintenance cost and service cost.
  • I agree that these are very important costs and should be fully vetted.
  • I concede to initially speaking about TCO with these costs at the core, because it appears to be the predominant way of the world.
  • However, I find that real, consultative, educational and eye-opening discussions surrounding TCO springboard from my desire to enlighten.
  • This enlightenment begins with the identification that pure, net monetary cost-savings are only a starting point that alone quickly becomes banal.
  • TCO, no matter how much the monetary savings are, only pivots into value with real “to the core” worth, when it is fortified with tangential insights – things that are often overlooked or talked about as if they are separate line items.

“Separate line items they are not. This is where we begin filling in the word Total.”

  • Conversation around licensing, maintenance and service is fairly black and white. What comes after that is truly mindful conversation where real value comes in.
  • What blows my mind is that these costs are often addressed in a vacuum as if they can stand on their own, absolutely.
  • In reality, although the above costs are without doubt important, they are just a small piece of what determines actual and real cost.
  • Opportunity costs, risk mitigation costs, migration costs, human capital costs, human resource costs, time to market costs, hardware costs…the list can go on forever.
  • I want to focus on two main ones: time and people.
  • At the end of the day, I can always reply to any and all questions about positively affecting TCO with a resounding ‘yes’ because I am confident that I can save enterprises time and preserve their people.
  • These two costs, however one may want to slice it and dice it, are priceless.

“O”

  • Ownership is commitment.

“Consider something that you want to own. Make sure that your people can and want to use it while leveraging the foundation of skills that they already have and let them build on it.”

  • Commitment comes from confidence that a sound decision and process to get to such decision has been performed with complete certainty.
  • In this day and age of ‘lease’ this and ‘rent’ that, true ownership and related commitment has become elusive.
  • The current IT landscape is truly exciting, dynamic, transformative and just plain awe-inspiring.

“The world we live in now, especially as it pertains to IT, is a continuum of ‘WOW!'”

  • I am a huge proponent of experimenting with, trying, and embracing new and innovative technologies.
  • In fact, I often dream of the next IT thing, one that is in all reality hard to even fathom.
  • At the same time, although I dream and dream big, I am not a dreamer.
  • There is a time and place for everything, and in enterprise business, a constant dream-state is not necessarily prudent.
  • To that end, when it comes to day-in and day-out TCO, it is sometimes better to buy a better, more efficient, and time-tested ‘wheel’, and resist the temptation to reinvent the wheel.

 

Are SQL Relational DBMSs Here to Stay?

The continuous developments in the DBMS market are remarkable. With the introduction of each new flavor – NoSQL, NewSQL, key-value, object-oriented, etc. – and iterations thereof, virtually all data can be managed effectively and as needed.

The databases that service the widely disparate needs of storage size, horizontal scalability, performance, persistence, elasticity, structured/unstructured data, ACID and related applications are equally distinct. So the questions remain, with all of these specialized DBMSs on the market today:

“Why is the relational model still the go-to DBMS of choice?”

  1. The SQL-standard, relational data model is the de facto DBMS standard for enterprises across the globe because structured, transactional data is present and necessary in all industries. Enterprises have poured extensive resources into developing optimized data architectures/schemas to work within the relational model. Entire enterprise-wide operations live and die by this model, and the ‘cost’ to unwind that is largely prohibitive.
  2. Also, there is a litany of more granular, practical reasons why the relational DBMS model, even with no drastic innovation, will be the DBMS staple for the foreseeable future. Some of these reasons are listed here:
  • Maturity, proven, battle-tested
  • Uncluttered data
  • Avoidance of data duplication
  • ACID compliance
  • Precise DML, thus avoiding data duplication
  • Facility in performing ad hoc queries
  • Functionality
  • Relatively easier to change data and perform complex queries
  • Flexibility in adding and removing specific data
  • Security/segregation of who can access what
  • Interoperability
  • Ease of Migration
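Two of the strengths listed above – ACID compliance and facility with ad hoc queries – are easy to see in practice. The sketch below uses Python's built-in sqlite3 module purely as an illustration (not any particular enterprise DBMS); the table and values are hypothetical:

```python
import sqlite3

# A minimal sketch of two relational strengths: ACID transactions
# and ad hoc queries. Uses an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])
conn.commit()

# Atomicity: a failed transfer rolls back both legs, never just one.
try:
    with conn:  # commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 70 WHERE id = 1")
        conn.execute("UPDATE accounts SET balance = balance + 70 WHERE id = 2")
        raise RuntimeError("simulated failure mid-transaction")
except RuntimeError:
    pass

# Both balances are unchanged -- the partial update never became visible.
print(conn.execute("SELECT balance FROM accounts ORDER BY id").fetchall())
# -> [(100,), (50,)]

# Ad hoc query: no schema change or precomputation needed.
total = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
print(total)  # -> 150
```

The same guarantees hold, at enterprise scale, in any ACID-compliant relational DBMS; that consistency under failure is a large part of why the model remains the default.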

“Isn’t it likely that disadvantages of the relational model such as performance, CPU consumption, cost/upkeep, data complexity from poor architecture, etc., will logically lead to its obsolescence?”

  1. Without innovation and improvements on relational DBMSs, it is conceivable that it will eventually fade away as a relic.
  2. However, innovation and transformation are constantly occurring with the relational data model at its core.
  3. This innovation solves a ton of problems that were long known as ‘intrinsic limitations’ of the relational model.
  4. Some examples of these ‘intrinsic’ constraints and the new innovations that address them are:
  • Performance: With the advent of in-memory databases, SSD and the combination of the two, the performance hindrance has largely disappeared.
  • CPU consumption: With Moore’s Law at the core, the traditional limitations imposed by CPU consumption, physical storage consumption, and interactive data management (such as joins) are greatly mitigated by the drastically lower monetary costs associated with scale-up (vertical scaling).
  • Cost/Upkeep: Although the initial costs to design an efficient and adaptable relational data model are still higher than for some other data models, the cost savings associated with (but not limited to) minimizing inconsistent data, avoiding problems that stem from eventual consistency, interoperating with enterprise-wide platforms, ease of migration, near-ubiquitous SQL understanding, reliable support, etc., ultimately outweigh the initial cost hurdles.
  • Data complexity: Data has become increasingly complex, and while the traditional relational data model has not adapted to such changes easily, times are changing rapidly. There are now wide-reaching solutions providers that specialize in integrating newer, more complex data structures into the relational model. Additionally, as DBAs and design architects enjoy the benefits of robust documentation on relational data, many of the once hard-and-fast exclusionary rules have systematically loosened.

This post is certainly not meant to be an exhaustive dissertation on all aspects, weaknesses and strengths of any new DBMS types. What I simply want to convey is that there are definitely benefits for specific applications from the amazing DBMS types that we have today. It is a huge benefit to the entire industry and innovation on the whole.

I will wrap this up by saying that Altibase, which is a SQL-compliant, in-memory DBMS with a hybrid architecture, has already solved many of the issues surrounding performance (in-memory DBMS), storage and flexibility (on-disk DBMS), and durability (combining in-memory and on-disk in a single unified engine), and we will continue to innovate to make sure that the relational model is here to stay.

A Top 10 Tech Trend 2 Years Ago – In-Memory Databases are Now a Must

In the not too distant past, in fact, quite recently, many chuckled at the notion of RAM being used as a viable storage medium for database management systems. Boy, have times changed.


I have heard it all:

“RAM is way too expensive, conceptually it sounds great, but it will never be used in the real-world.”

“Our data is important to us, so RAM will never work because if the power goes out, we lose all of our data.”

“I don’t need that kind of performance.”
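The second objection, durability, is the easiest to demonstrate. The sketch below uses Python's sqlite3 module as a neutral stand-in (not any specific vendor's product): data in a purely in-memory database vanishes when the connection goes away, while an on-disk database survives a restart. The file path is illustrative:

```python
import os
import sqlite3
import tempfile

# In-memory: contents live only as long as the connection ("the power").
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE t (v INTEGER)")
mem.execute("INSERT INTO t VALUES (42)")
mem.close()  # the row is gone with the connection

# On-disk: the same data persists across a close/reopen cycle.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE t (v INTEGER)")
disk.execute("INSERT INTO t VALUES (42)")
disk.commit()
disk.close()

reopened = sqlite3.connect(path)
print(reopened.execute("SELECT v FROM t").fetchone())  # -> (42,)
```

Hybrid engines answer this objection by pairing RAM-resident tables with durable on-disk storage and logging, so a power loss does not mean data loss.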

According to a recent article in InformationWeek, 10 In-Memory Database Options Power Speedy Performance:

“Yes, the death of the conventional disk drive has been greatly exaggerated, but Moore’s Law has brought down the cost of RAM so dramatically that in-memory technology is getting to be downright pervasive.” –Doug Henschen, Executive Editor of InformationWeek

“The earliest purveyors of in-memory databases included Altibase, solidDB (recently divested by IBM), and TimesTen (acquired in 2005 and still owned by Oracle). These products emerged for niche applications such as telecom, financial trading, and high-speed e-commerce. Today these products are seeing broader use, branching into analytics, big data, gaming, and Internet-of-Things-style applications.” –Doug Henschen, Executive Editor of InformationWeek

“From Altibase to VoltDB, and covering options from IBM, Microsoft, Oracle, and SAP, we wrap up leading in-memory databases and add-on options. When you need speed, here are 10 tools to choose.” –Doug Henschen, Executive Editor of InformationWeek

The reality:
Companies that are early to embrace in-memory DBMS technology will have a leg up on the competition. From real-time inventory control, compliance/policy control, defense and financial trading to authentication, marketing communications management and geo-location tracking, in-memory databases are taking over. In-memory databases still have a storage limitation defined by the amount of RAM that can be stuffed into a machine. However, single servers can already handle 6 TB of RAM, and this limitation is rapidly fading away.

The storage limitations do constrain some use cases that involve data warehouses and other massive petabyte scale deployments, but by and large, thousands of common use cases benefit from in-memory performance. This phenomenon will only grow over time.

With several hundred in-memory database options available today, it is still remarkable that only a handful preside as the best of the best.

Altibase prides itself on its maturity, being one of the first in the world to come to market. With its humble beginnings 23 years ago as an academic research project, it is amazing that it is now the in-memory DBMS vendor of choice for nearly 600 of the top enterprises across the globe.

Nobody gets fired for hiring Oracle. Or will they?

“Oracle: Bernstein Ups to Buy, Cloud Clearing Up; Credit Suisse Lauds In-Memory Initiative.” **

As in-memory database management systems are becoming more and more mainstream with each passing day, there are many questions that challenge the status quo.

What was once an easy decision – “use Oracle,” “use IBM,” “use Microsoft” and you are covered – is no longer the pillow of job security it once was.

Gartner’s CEO noted in a recent keynote speech that the leaders of today (he mentioned Oracle and SAP) will no longer be the leaders inside of 6 years.

C-Level executives are now challenged with mind-numbing pressure to steer their companies to the forward-looking data management strategies and vendors that will certainly involve in-memory databases – but the question lurks – which one?

We live in a new “hyper-connected” world that revolves around truly breathtaking data processing speeds, but the fundamentals of database management systems have not changed. Choose experience. Choose maturity. Choose wisely.

Playing it safe by betting on the big names for the name alone is no longer an option. The unfortunate reality of many of the big names in the database space is that they acquire technology to attempt to keep pace. The net result is akin to betting on a race after it’s already over.

Vendors, no matter how large, that are chasing the flavor of the day or are redefining their core business models will fade. No one in this space is too big to fail.

To thrive, C-level executives have to play it smart and invest in those that have defined themselves in the in-memory space.


** Barron’s Blog Tech Trader Daily; May 5, 2014, 12:42 P.M. ET