Well, 2009 has been interesting, especially in the CMS space. It's all about cost control and compliance right now. Companies are under direct and extreme pressure to stay above water as consumer and business spending is down. They have slashed budgets, minimized inventories, and cut head count in an attempt to look good to shareholders and the market. In times like these, when efficiency is high on the list, it would seem that demand for CMS, ECM, and WCM solutions will increase. CMS solutions do present a viable option for improving efficiency in web development, document management, records management, and so on. But large CMS projects and software solutions are expensive, not so much in licensing as in people and time. Company leaders are under pressure to add value now, not in two years, and CMS projects need to account for this as well. From 2000 to 2005, we saw the famous ECM arms race, as companies looked for the one-stop shop to handle all information management. However, most ECM products cannot achieve best-in-breed status across all elements of content management (document management, records management, web content management, etc.); the holy grail of a total ECM solution has yet to be found.
Taking these factors into consideration, I have put together some thoughts on how CMS will trend in 2010.
- Federation over Centralization: I always love the line "the best repositories are the ones you have." This is where I see CMS going. CMS solutions will need to keep growing their integration capabilities and function as the connective tissue between federated repositories. The opportunity cost of migrating large legacy repositories from one format into another, centralized CMS is high. Centralization is an attractive term from an operations management perspective, but it is most often not technically viable. Some CMS evangelists always promote centralization and consolidation in content management practices, but this is not really realistic. Many companies will continue to leverage different solutions and packages across their CMS stack, and these products will continue to need new ways of talking to each other.
- Cloud Options: With the economic downturn in play, cloud solutions will grow in adoption, especially in web content management and marketing scenarios. The cost savings and time-to-market considerations will create significant pressure to try out these options. Many vendors are looking for examples of how to install and configure their products on Mosso, Amazon EC2, HP, IBM, or other cloud hosting providers.
- CMS + API + SOA (REST, JSON, XMPP): Content will need to be continually pushed and pulled to and from ever more sites, channels, and mobile devices. Content will need quick and easy means of integration into widgets, apps, iPhone and Android apps, and so on. As a result, APIs and SOA for content services are critical for strategic positioning.
- WCM + Analytics + Targeting + Testing: WCM will continue to expand into the complete experience around content, especially how content performs and how it is targeted and delivered within experiences across multiple sites and channels. WCM vendors will continue to acquire and establish partnerships to expand their offerings.
- Faceted Search: As federation expands, faceted search will grow in importance as a way to search for and locate content via filtering and metadata.
- Open Source will expand: Open source solutions will grow in adoption, especially in social networking and content distribution scenarios. The adoption of Drupal by http://www.whitehouse.gov is dispelling the myth that open source cannot scale and provide an enterprise-level solution.
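To make the API point above concrete, here is a minimal sketch of what consuming a content service's JSON might look like. Everything here is an assumption for illustration: the payload shape, the field names, and the `parse_content_item` helper are hypothetical, not any particular vendor's API.

```python
import json

# Hypothetical payload a CMS content service might return for a request
# like GET /api/content/press-release/1234 (the URL and field names are
# assumptions, not a real vendor API).
payload = """
{
  "id": "1234",
  "type": "press-release",
  "title": "Q3 Results",
  "body": "Revenue grew 12% year over year.",
  "tags": ["finance", "earnings"]
}
"""

def parse_content_item(raw: str) -> dict:
    """Decode a content item so any channel (widget, mobile app, feed)
    can reuse the same structured fields."""
    item = json.loads(raw)
    # Keep only structured, presentation-free fields for downstream reuse.
    return {"title": item["title"], "body": item["body"], "tags": item["tags"]}

item = parse_content_item(payload)
print(item["title"])  # → Q3 Results
```

The point is that any channel that speaks HTTP and JSON, whether a widget, a phone app, or another site, can consume the same structured item without knowing anything about the CMS behind it.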
Faceted search and navigation has been mainstream for a while now on the larger eCommerce sites. This was driven both by the faceted nature of product data (most products have a type, brand, price, etc.) and by the availability of that data in retailers' existing information systems.
Interestingly enough, even though the technology is there, the use of faceted search and navigation on content-centric sites has been lagging. However, in the last year we have finally seen an uptick in the use of this pattern beyond commerce sites. With the redesigned Bing search engine really leveraging this concept and driving some of the innovation around search, I believe we will see it become standard practice on sites with large amounts of content.
One additional contributing factor is that there is also some traction at the lower end of the market and in open source. Although vendors like Endeca and Coveo have been providing this capability for the enterprise for a while now, open source and low-cost alternatives are emerging as well.
For example, Apache Solr is getting a lot of traction recently, and Acquia launched a hosted faceted search capability for Drupal earlier this year.
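As a small illustration of how a facet request is expressed, here is a sketch that builds a Solr select URL using Solr's standard faceting parameters (`facet`, `facet.field`, `facet.mincount`). The endpoint, core name, and field names are assumptions for the example.

```python
from urllib.parse import urlencode

# Hypothetical Solr endpoint; the "content" core and the field names
# passed below are assumptions for illustration.
SOLR_URL = "http://localhost:8983/solr/content/select"

def facet_query(text: str, facet_fields) -> str:
    """Build a Solr query URL that asks for facet counts on the given fields."""
    params = [
        ("q", text),
        ("wt", "json"),
        ("facet", "true"),
        ("facet.mincount", "1"),   # hide empty facet buckets
    ]
    # facet.field may be repeated once per field to facet on.
    params += [("facet.field", f) for f in facet_fields]
    return SOLR_URL + "?" + urlencode(params)

url = facet_query("annual report", ["content_type", "author", "year"])
print(url)
```

The response would then carry counts per `content_type`, `author`, and `year` value, which the UI can render as clickable filters, exactly the pattern the eCommerce sites pioneered.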
This is an exciting development, and I believe we will see significant improvements in site search in the future.
By Matthew Johnson
Senior Technical Architect
Razorfish CMS Center of Excellence
As we have seen over the last five years, ROI is one of the most important performance measures and justifications for large enterprise CMS projects. The primary measure of CMS performance can no longer be based solely on efficiency (automation and centralization of content). Now, actual content performance measures the success of a CMS. Content performance depends on how relevant a company's content is to its target audience and customer base. Continual testing, optimization, and delivery of content are vitally important for companies looking to improve conversion rates, usability, and ROI. It has been shown that small changes can have huge impacts on conversion, sales, and so on. Through multivariate testing, companies can quickly identify and adjust content to meet current demands and adopt a test-and-learn methodology to refine it. However, most CMS vendors focus on the repository and display of content, not the complex process of leveraging analytics to refine, adjust, and deliver content based on real-time scenarios. Pretty much every CMS product on the planet can handle the basic CMS functions (storage, metadata, presentation templates, basic workflow, etc.), but most do not have A/B or multivariate testing capabilities.
A/B testing: compares two different versions of a page or content location against each other.
Multivariate testing: simultaneously measures the performance of several content variations on a single page.
However, this is all changing: large CMS companies see this capability as a key blue-ocean feature. This year Autonomy/Interwoven and Sitecore added optimization and targeting capabilities to their CMS platforms. Even analytics vendors are arming up with this capability; WebTrends acquired Widemile, a leading provider of multivariate testing and site optimization technologies and services.
I think the role of the CMS will change accordingly: a holistic CMS/targeting solution, I believe, will give many organizations a competitive advantage in delivering best-in-breed content to their targeted audience.
Drupal is a great open source CMS. One of the reasons I like it is the strong community around the platform. There are many different modules available for all kinds of purposes, ranging from forums to integration with salesforce.com. In addition, many vendors releasing new products are creating plugins for the platform to generate traction, such as Kaltura, OpenX, and Mobify.me. With this kind of community support, open source CMS solutions have a leg up in how easily new features or trends can be incorporated into a web site. Other open source solutions like WordPress have these same benefits as well.
It is interesting to see that there is now also some community traction around enhancing the user experience of Drupal's administration functionality. This blog post shows some great ideas for improvements that can be made while still building upon the foundation currently in place. The Drupal 7 User Experience project outlines similarly strong ideas. Incorporating Ajax functionality such as drag and drop into the admin user interface would make Drupal a much more user-friendly CMS, allowing business users to really control their site.
Seth Gotlieb has an interesting blog post on CMSWire about the CMS software selection process. He argues that building a requirements matrix and evaluating and scoring products against it is wasted effort.
He makes some valid points. With any analysis that involves numbers, it is always dangerous to assign too much value to the actual number and its precision. In addition, it is easy to get lost in the details and lose sight of the big picture.
In our experience, the requirements matrix still plays an important role. It ensures that all the different areas are covered, from end-user to IT needs. The matrix creation process also triggers important discussions with the different stakeholders.
However, equally important in the selection process are vendor demonstrations. They give users a good perspective on each solution and force them to think about how a specific product would work within their process and environment. The follow-on discussions then provide a forum to identify any red flags with specific software packages.
Once the leading products have been evaluated and scored, and the vendor demonstrations have taken place and been discussed, we take a step back and look at the overall results. Based on these elements, a decision can be made that meets the majority of requirements and is also the right fit for the organization.
Many CMS vendors have been busy incorporating richer targeting functionality into their CMS products. For example, Interwoven acquired Optimost and also incorporates targeting capabilities in its LiveSite product, Sitecore is coming out with an Online Marketing Suite, Tridion has its Unified Online Marketing Suite, and Fatwire has had targeting capabilities for a long time. At the same time, analytics vendors are now in the same market, with Omniture offering Test & Target, Google offering Website Optimizer, and Webtrends announcing yesterday that it is acquiring Widemile.
Obviously, CMS vendors need to move beyond basic content management. Offering targeting capabilities is very attractive, as it is an area where CMS vendors can claim direct results and clear ROI, justifying the investment companies make in buying their products.
Razorfish just released a great report at http://fluent.razorfish.com. The report shows the ever-growing importance of social media within the digital landscape. An interesting topic that is hard to grasp is how enterprises will gain control of social media and centralize federated content into one holistic repository or a mashup of content repositories. As social media expands, archival, compliance, and retention will only grow in importance. We are seeing content within organizations grow at more than 100% year over year; for a Fortune 100 company, that amount is massive in sheer size. Many solutions in play, such as Autonomy/Interwoven, SharePoint, FileNet, etc., just do not have a clear method for centralizing content that is being distributed across social media and other off-domain channels. Content is becoming more segmented and portable, but how will organizations retain and process the vast stream of user-submitted comments and content? Content is becoming federated and spread across channels, but how will it phone home and be stored?
by Matthew Johnson
I just wanted to provide a brief reply to a basic question I get from various clients: why should I use a CMS? Many clients who are new to CMS/WCM view the solution as just another repository or database. I continually get this response: why can't we just put the content in Oracle or SQL Server? Well, you can, but that is not the primary objective of a CMS, and it will not get you economies of scale in content management. A CMS is supposed to enable a "management system," which involves:
- Operations Management & Governance: (Strategy, Policies, Roles, Responsibilities, Workflow, etc.)
- Content Development: (Content Editing, Reuse, Search Engine Optimization, Channel Agnostic and Specialization)
- Content Repository: (Content Storage, Source Code Storage)
- Content Distribution: (Cross Channel / Cross Infrastructure Deployment, Integration Services, Transformation/Template Services)
Many companies jump right into a repository-and-integration mindset: how do I store some content and get it onto the site? That solves a real problem, but to get value out of a CMS, you must set up the operational infrastructure and governance model to sustain it. If you focus only on repository and integration, you are solving specific technical problems for what is most often a single, defined channel, such as one web site, rather than larger-scale, business-specific operational needs. Most often this approach involves a reduced set of resources who know how to store and integrate the content for that specific channel. The result is a CMS that can be leveraged by only a small subset of people, which limits the reuse of content. I cannot count how many times I have seen a multi-million-dollar CMS implementation that only four or five people can use and that exercises only 30% of the product's capabilities.
A CMS is supposed to empower your content management model, allowing more people to participate in the content process, whether that is content production, viewing, reuse, or deployment. A properly deployed CMS solution will also shorten time to market for content integration and distribution into new channels.
Key questions to ask yourself before you begin big CMS projects are:
- Who participates in CMS requirements? (It should not be controlled by IT only)
- Have you developed your Content Types and Content Hierarchies?
- Have you defined your roles and responsibilities? (Who are the editors, content contributors, QA, and support staff?)
- Have you defined your existing workflows and how they can be mapped to a CMS?
- Is your content semantic and portable?
- Are you planning to transform your content so it can be used cross-channel?
- Do you leverage in-context views of content?
- Do you use data capture and presentation-layer template technology?
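As an illustration of the content-type question above, here is a minimal sketch of a declared content type that keeps structure and required metadata with the type rather than in presentation markup. The type name, fields, and `validate` helper are all hypothetical, invented for this example.

```python
# A hypothetical content-type declaration: required structure lives with
# the type, so any channel can rely on the same fields being present.
ARTICLE_TYPE = {
    "name": "article",
    "fields": {
        "title": {"required": True},
        "summary": {"required": True},
        "body": {"required": True},
        "keywords": {"required": False},
    },
}

def validate(item: dict, content_type: dict) -> list:
    """Return the names of required fields missing from a content item."""
    return [
        name
        for name, spec in content_type["fields"].items()
        if spec["required"] and not item.get(name)
    ]

missing = validate({"title": "Q3 Results"}, ARTICLE_TYPE)
print(missing)  # → ['summary', 'body']
```

Governance then becomes enforceable: an item that fails validation never enters the production line, which is exactly the discipline that makes content reusable later.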
by Matthew Johnson
The last decade has been a time of radical change in the content ecosystem. Users used to consume branded content primarily through one or a few "on-domain" channels like somecompany.com. This hub-and-spoke ecosystem focused on luring as many potential or current users as possible to on-domain properties to consume content that conveys the company's value proposition. Companies presented online users with emails, micro-sites, and advertising, with the goal of getting as many users as possible to click through into their "walled garden" (the large .com site). Within these large walled gardens, companies could leverage web content management tools to share and distribute large blocks of HTML and metadata across organizational silos. However, times have changed: the advent of web 2.0 and social media changed how content is consumed. Walled gardens that follow traditional on-domain methodologies are being torn down, and in their place are social media, open networks, new aggregators, API frameworks, and widgets. The critical success factor is no longer getting people to click through to on-domain content but how well content spreads virally across the Internet through the channels most relevant to each user.
However, many CMS solutions unintentionally bind content to a specific channel of consumption, aka blabla.com. Content types created in Interwoven, Drupal, Alfresco, Sitecore, and other products need to be strategically designed to anticipate and plan for new and developing channels of content distribution. Content types need to house semantic, simple content to ease distribution across news sites, Facebook, mobile apps and the mobile web, and emerging channels like Xboxes, digital signage, set-top boxes, podcasts, and so on. I see the issue over and over again: companies create global content types to meet a current or generalized need and then start up the content production line using Dreamweaver or WYSIWYG editors. Content that looks reusable cannot actually be reused, because the content types do not provide the metadata, guidelines, governance, and structure that give content its agility. Content should always strive to be channel agnostic. Content should be so semantic that it can be easily dropped in and restyled using CSS or XML transformations. I can say it over and over again: it's all about the content types and their hierarchies. CMS projects should not always jump straight into technical integration and performance testing. True performance is how far you expand the reach of your content across the web, not how much traffic you drive to a specific site.
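A tiny sketch of what channel-agnostic reuse looks like in practice: one semantic content item, multiple channel-specific renderings applied at the edge. The item fields and render functions are assumptions made up for this illustration.

```python
# One semantic content item, free of any markup or channel styling.
item = {"title": "Store Opening", "summary": "We open downtown on May 1."}

def render_html(content: dict) -> str:
    """Web channel: wrap the same fields in semantic HTML, styled via CSS."""
    return f"<article><h1>{content['title']}</h1><p>{content['summary']}</p></article>"

def render_plain_text(content: dict) -> str:
    """Text channel: e.g. SMS, feeds, or a digital-signage ticker."""
    return f"{content['title']}: {content['summary']}"

print(render_html(item))
print(render_plain_text(item))
```

Because the item itself carries no presentation, adding a new channel (a widget, a set-top box) means writing one more renderer, not re-authoring the content.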
by Matthew Johnson
A common conflict I am seeing across companies is the clash between the CMS needs of B2C, external web 2.0, social media Internet infrastructures and internal B2B/B2E Enterprise 2.0 enterprise content management efforts. While I do see a convergence of the Enterprise 2.0 and CMS markets, I still do not see a cohesive enterprise solution for Web 2.0 and CMS that bridges the gap between internal and external efforts. Many companies are still looking for the holy grail solution that will fulfill all their needs (B2B, B2E, B2C), but the fact is (and I am sure many people will disagree) that solutions such as Microsoft SharePoint and other ECM tools that excel on intranets often fail to meet expectations in B2C Internet scenarios. However, this holy grail will be needed when it comes to holistic compliance and legislative requirements in the future.
Internal efforts within companies can live with canned, generic solutions, while B2C Web 2.0 solutions need to be unique, cohesive, and sticky. Now, I know the evangelists of Microsoft, Documentum, Drupal, and Joomla will all say that their solutions can meet that need. But the problem I see in the market is speed: almost all of the time, these solutions need at least three to six months to get a site online that meets all the needs of creative, user experience, legal, compliance, analytics, and, most importantly, the business. I wish a solution were developed API-first, where the CMS was not bound to any platform at all and the entire engine was web service, REST, and JSON driven. This would allow the entire infrastructure to be completely decoupled from presentation and placed in the cloud, not tied to .NET, Java, PHP, Ruby, etc. I know this is a rant… but where is it?