
December 24th, 2008

New year fast approaching, and so is the next-generation enterprise data warehouse

Posted by James Kobielus @ 5:49 am

Categories: Data warehousing

Tags: Appliance, Amazon.com Inc., Microsoft Corp., I&KM, MPP, Business Intelligence, Databases, Service-Oriented Architecture (SOA), Tools & Techniques, Data Centers

This has been one of the most pivotal years in the evolution of the enterprise data warehousing (EDW) market. Every EDW vendor of note has firmly repositioned its go-to-market strategy around the appliance approach, with some also taking tentative steps into what is sure to be a key theme in 2009 and beyond: EDW in the “cloud.”

Yes, much of the recent “cloud” buzz has been bleeding-edge IT trade-paper fodder. It’s all interesting, to be sure, and there’s plenty of innovation going on. But much of the discussion seems to be a renaming, repackaging, and mashing up–with subtle twists and tweaks–of such well-established themes as service-oriented architecture (SOA), software as a service (SaaS), virtualization, and Web 2.0. And much of the trendy attention to cloud services obscures the fact that public cloud offerings from Amazon, Google, Microsoft, and others are still primarily works in progress.

Forrester clients have only recently begun to inquire about cloud topics in earnest. But still, it’s hard to stay cynical about cloud computing for long. This solution delivery approach is almost certainly coming, inexorably, to all IT solution segments, including EDW and business intelligence (BI). Sure, I&KM pros can safely ignore much of the cloud buzz for now, but a year ago they said the same about EDW appliances, and look how quickly appliances have become a dominant deployment approach in this market.

In the EDW-related inquiries I take from Forrester information and knowledge management (I&KM) customers, a great many concern the appliance market. CIOs, CTOs, DBAs, and other professionals are actively considering various vendors’ appliances to replace, or at least to supplement and extend, their traditional “roll-your-own” EDWs. Typically, the I&KM pro wants me to help them select the best EDW appliance for their needs from any of several vendors, both venerable blue-chip and sexy start-up.

What validated appliances for I&KM pros this year was the fact that big-name EDW vendors–including Teradata, Oracle, IBM, Microsoft, and Sybase–have gone this route in earnest. The inflection point for the whole industry came this fall, when Teradata–which effectively established the EDW appliance space years ago but had long resisted going to market under that label–embraced the approach and significantly expanded its appliance solution portfolio.

EDW cloud services are still a few years away from a similar inflection point. The leading EDW vendors have taken only the most tentative steps into the still-embryonic cloud services market. But they are all beginning to explore the cloud/SaaS channel with greater interest. They simply have to. Customers’ capital budgets are under severe pressure, and a multi-million dollar EDW solution–be it a premises-based appliance or what have you–is a tough sell. In a soft economy, any on-demand pay-as-you-go offering becomes more attractive across all customer segments. Just as important, the increasing scalability, performance, flexibility, and availability demands on the EDW and BI infrastructure are spurring many users to consider managed, hosted, outsourced offerings with fresh interest.

We’re starting to see the next-generation cloud EDW emerge. One noteworthy development this year was Oracle’s partnership with Amazon. Under that agreement, Oracle customers can license Oracle’s core EDW software stack to run in Amazon Web Services’ Elastic Compute Cloud (Amazon EC2) environment. They can also use their existing software licenses on Amazon EC2 with no additional license fees.

Another important development was Microsoft’s announcement of its Windows Azure cloud initiative, of which one key component is the (still in beta) SQL Server Data Services (SSDS) subscription offering. When Microsoft SSDS goes into production in 2009, it will offer some basic DW/BI features in addition to transactional database support. Though SSDS will not initially be at functional par with the licensed SQL Server offerings, it is clear that Microsoft plans to evolve it rapidly toward becoming a feature-competitive DW cloud offering over the next several years.

Microsoft also made a key EDW-related acquisition this year, appliance pure-play DATAllegro. Forrester expects this acquisition to figure centrally into Microsoft’s evolution of SSDS into a massively scalable cloud DW service. Though DATAllegro did not achieve much market adoption as a DW appliance pure-play, it provides Microsoft with a robust scale-out technology called “shared-nothing massively parallel processing” (MPP). By the way, Microsoft is playing catch-up in this regard, since most of its closest competitors implement shared-nothing MPP in their EDW premises-oriented solutions, and such DW appliance pure plays as Netezza, Greenplum, and Vertica also implement it to varying degrees.

When Microsoft ultimately ships a DATAllegro-powered SQL Server EDW appliance under its “Project Madison” in a year or two, we would not be surprised to see it adopted first in SSDS. The cloud EDW offering would benefit immensely from shared-nothing MPP’s ability to manage petabytes of analytic data and parallelize queries and other transactions seamlessly across a grid of hundreds or thousands of nodes.

Indeed, the industry consensus is largely in favor of shared-nothing MPP across the storage and compute tiers, coupled with flexible information-as-a-service (IaaS) and server virtualization, as the principal platform for cloud computing. In the next-generation EDW, shared-nothing MPP allows the infrastructure to become more fluid, flexible, and virtualized, while managing ever more massive data sets and providing the agility to handle more complex mixed workloads of reporting, query, OLAP, data mining, data cleansing, transformations, and other functions.

As 2009 approaches, we’ll see pure-play DW cloud vendors come to the fore, appealing both to the early adopters among I&KM pros as well as to those under severe budgetary, headcount, and data center constraints. The established EDW vendors will not come to the cloud in a big way till 2010 at the earliest, it appears. But they will come, and with all guns blazing.

In 2-3 years’ time, the established vendors will own the EDW cloud space just as they’re starting to own the rapidly maturing appliance segment–in part, by acquiring the most promising cloud startups. The longer the economy stays drab and dreary, the faster the cloud EDW segment will expand, mature, and consolidate.

As that happens, commercial EDW cloud offerings will become as diverse and feature-rich as appliances have become, and I&KM pros will almost certainly ramp up their cloud EDW inquiries.

December 12th, 2008

Predicting the battle over collaboration infrastructure in 2009

Posted by Gil Yehuda @ 11:24 am

Categories: Collaboration, Web 2.0

Tags: Information Technology, Business, Collaboration, Groupware, Strategy, Enterprise Software, Software, Management, Gil Yehuda

It’s always the short questions that make my job interesting. Like this one.

Gil, do you think companies will cut back on Enterprise Web 2.0 in light of the economy?

First reaction–it depends. I’m an analyst, that’s always our first answer. But what does it depend on? What are all the factors at play and how will this impact your decisions?  So, here’s my read of the Enterprise Web 2.0 trends based on many conversations with my clients and vendors. I will focus specifically on wiki and social networking tools used to improve internal collaboration and knowledge sharing. These are gaining momentum and acceptance within the enterprise. (See my TechRadar report for the details on what Forrester sees in scope for Enterprise Web 2.0.)

There will be a slowdown of IT-driven collaboration projects in 2009. But there will be increased interest in business-driven collaboration projects. Why? There is a technology populist movement, and there has been for a while. Small and medium-sized businesses (SMBs) typically operate with little IT support and rely upon vendors for collaboration services – nothing new here. But we find that business units in enterprises, especially those in companies with politically weak IT departments, are increasingly behaving like SMBs, and they are going out and provisioning technology on their own. This is a form of institutional Tech Populism.

IT departments react by trying to block the business from getting software services from the cloud. And for good reason.  It’s hard to manage disconnected islands of infrastructure. Furthermore, it circumvents IT, adding complexity and risk to their job–especially if they were not involved in selecting the service. Anecdotally, we find that workers are now spending more time working from their home computing environment in order to access the blocked online productivity sites.

In response, IT says “Ok, we’ll provide you what you need so you don’t have to go behind our backs.” In parallel, the traditional brand name ISVs are bolting collaboration features onto their existing enterprise platforms.  IT departments want to minimize operational complexity, so they will be attracted to wikis and social networks provided by vendors with familiar names–regardless of their functional adequacy.

Will it work? In an ideal economy where IT has a budget, maybe.  But what about now?

I predict that IT-driven internal collaboration initiatives will be squeezed tight: 1. They are usually more expensive than the Tech Populist options. 2. IT is being asked to sacrifice projects, and it would rather cut fat, not bone; that is, it would rather protect its bread-and-butter IT infrastructure from being outsourced. And 3. The business considers projects initiated by IT to be less vital. Remember who pays the bills.

However, for business-driven internal enterprise Web 2.0 collaboration projects, I see growth. Why? Because the business will find its collaboration needs growing in 2009, while it sees IT providing fewer services. Collaboration needs grow as a result of layoffs, mergers, and deepening external partnerships (requiring new infrastructure to collaborate outside the firewall with trusted, external partners). And this happens while IT’s services shrink as a result of layoffs and a focus on streamlining operational costs rather than taking on new projects.

Who wins? The SaaS-based collaboration vendors: folks like Box.net, GroupSwim, Jive, OneHub, PBwiki, SocialCast, Socialtext, and others who provide collaboration services in the cloud for about $5-$15 per user per month, give or take. These products range in functionality: some focus on the wiki, others on the social network, and still others are better suited for file sharing within trusted groups. But these are easy pickings for businesses that are looking to circumvent IT and set up a small departmental solution, especially in departments that are looking to collaborate with a few external partners.

The good news is that when the economy picks up, IT and the business can have a heart-to-heart talk and make some decisions about the future of the SaaS-based content. Some will leave it in the cloud if they continue to like the prices and features. Others will revisit the brand-name collaboration options that are provided by the ECM and portal vendors. By that time, many of these options will have caught up in functionality, providing a solution that will make both IT and the business happy.

And now, the battle. The story above works for companies that are willing to move to SaaS-based products to address near-term collaboration needs in 2009. But many organizations cannot, or will not, allow themselves to house their intellectual property on someone else’s servers – no matter what the vendor says to assure them. This means that organizations with hard-line IT shops will face a battle between IT and the business for a collaboration solution that integrates with IT’s existing infrastructure, but requires little IT involvement.

Now, there are good on-premise collaboration tools in the market today that are poised to resolve this battle – depending on what your IT infrastructure is. The challenge I see is that many of these vendors are not sure to whom they are selling their solution – to the business, to IT, or to the fragile partnership between the two. Remember, partnerships encounter stress during tough economies. It’s hard for IT and the business to work together when they are eyeing each other’s budgets.

You see my thoughts, so please share yours.  Given the current economy:

  • Will you cut back on Enterprise Web 2.0?
  • Will you deepen the divide between IT and the business?
  • Or will you try to form a stronger partnership between the two?

December 10th, 2008

Mark Twain, US Politics, and the state of enterprise BI

Posted by Boris Evelson @ 5:02 pm

Categories: Business intelligence

Tags: User Satisfaction, Pricing, Business Intelligence, Tools & Techniques, Databases, Enterprise Software, Marketing, Software, Data Management, Management

If you haven’t yet heard the latest news on the American political scene, let me fill you in: Illinois Governor Rod Blagojevich has been arrested on charges of conspiracy to commit fraud and soliciting bribery. Among the alleged offenses is that the Governor planned to sell the Senate seat of President-elect Barack Obama to the highest bidder, or, if no offers met his expectations, to take the seat for himself for personal gain. One is reminded of the remark, often attributed (perhaps incorrectly) to Mark Twain, that the United States has “the best politicians money can buy.”

Putting witticisms aside, this news does remind us all that spending more money is no guarantee of success. When we say someone has “the best X money can buy,” we don’t mean they have the best, period. There’s always room for using what you buy more intelligently, effectively, and efficiently, or finding a hidden gem at bargain prices. Information and Knowledge Management professionals often delude themselves into thinking that spending more money on enterprise applications, especially business intelligence, is the key to success, but that’s simply not the case. Our latest survey (the results of which will be published soon) shows that most large businesses have at least three, and often many more, BI tools in house, often with no plans to consolidate - and larger enterprise deals for BI can run well over six figures. Add to those license costs and maintenance fees the price of associated professional services (which can be, anecdotally, up to five times as much as license fees for a given project), and it becomes abundantly clear that enterprises are spending plenty on “the best BI tools money can buy.”

However, are they really getting the best BI, or just expensive BI tools? User satisfaction is still mediocre at best: our latest survey shows that BI applications are hard to use, trust in the data is poor, and IT is too slow to respond to requests for new reports or data sources. Additionally, in spite of the best efforts of analytics evangelists, BI is still mostly relegated to back-office efficiency gains, instead of being used to drive the strategy of the business.

Our client interactions constantly show that, while the best BI tools are a necessary component of the best BI implementation, other components are equally as or even more important than the BI tools themselves. Business-led data governance, integrated MDM and metadata processes, finely tuned Business Intelligence Solutions Centers, and end-user BI self-service are just a few of the non-technical components that make up the best BI that money can’t simply buy.

What do you think? How can we move from “the best BI tools money can buy” to “the best BI?”

December 3rd, 2008

Extranet collaboration platforms - coming soon, but many-to-many problem remains

Posted by Ted Schadler @ 11:21 am

Categories: Collaboration

Tags: Extranet, PBwiki, Thomson Reuters Messaging, Networking, Collaboration, Groupware, Enterprise Software, Software, Ted Schadler

In our conversations with many information and knowledge management professionals, it’s clear that their distributed and multicompany teams need better extranet collaboration tools.

And they feel the problem is only getting worse as companies go virtual, global, distributed, outsourced, green, travel-less, and partnered, thus driving the need for ever-better collaboration tools that work outside the firewall.

Trouble is, the messaging and collaboration services that  companies have implemented are designed primarily for internal teams.

For example, it’s bloody difficult to set up a secure instant messaging connection with every partner you might want to work with. Such interoperability between IM platforms is technically possible, but operationally nightmarish.

So clever employees do what they must: Use public IM and calendaring services, cobble together conferences from piece parts, and fall back on endless scheduling and sharing emails and voice conferencing. Ugh. Ugly. And scary.

Well, the solution’s just around the corner say vendors new and old. After all, many are on the cusp of major product releases that promise much better extranet connections and capabilities:

  • IBM Bluehouse promises a new extranet collaboration platform.
  • Google already offers an extranet collaboration toolkit in its Google Apps Premier Edition.
  • Cisco is adding extranet collaboration capabilities to WebEx.
  • Microsoft is moving its services into the cloud for easier extranet access.
  • PBwiki is already cloud-based and ready for extranet collaboration.
  • The extranet collaboration toolkit list goes on: Veodia, Forterra, Dimdim, Qwaq.

That may be true. I surely hope so. But I fear that we have a few more hurdles to clear before extranet collaboration becomes as straightforward as internal collaboration. The basic problem is the many-to-many connection problem. (The mathematicians I know would call it a combinatorial problem: the number of pairwise connections grows quadratically with the number of companies, to say nothing of people, involved.)
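A quick sketch makes the scaling difference concrete (the function names here are illustrative, not from any product):

```python
# Pairwise extranet links among n companies versus routing everyone
# through a single shared hub. Names and figures are illustrative.

def pairwise_connections(n: int) -> int:
    """Direct company-to-company links: n choose 2."""
    return n * (n - 1) // 2

def hub_connections(n: int) -> int:
    """Each company connects once to a shared cloud platform."""
    return n

for n in (5, 20, 100):
    print(n, pairwise_connections(n), hub_connections(n))
# 100 companies would need 4,950 direct links, but only 100 hub links.
```

This is the arithmetic behind hosting the platform in the cloud: the quadratic pairwise count collapses to one connection per company.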

In addition to all the important stuff to support multicompany teams — conferencing, video, shared calendars, team sites, persistent chat, search, shared documents, unified communications, structured processes, etc. — these structural problems must be solved:

  1. An extranet collaboration platform will have to be set up in the cloud. Using a cloud-based provider turns a many-to-many quadratic problem into a many-to-one linear problem. But one of the partnering companies must still own and control the platform services.
  2. The many-to-many permission problem must be solved. How do you manage access control when teams form and disband, and companies sometimes partner and sometimes compete? The access control tools need to become easy to use and to integrate well with existing corporate directories.
  3. IT will have to grow comfortable with new access control dashboards. Thomson Reuters Messaging has solved this problem for some industries, giving individual companies control over what their employees can do on the extranet collaboration platform that it hosts.

November 24th, 2008

Two paths for information management pros in an economic downturn

Posted by Matthew Brown @ 10:58 am

Categories: Uncategorized, Information Workplace, Economic downturn

Tags: Information Management, Business Technology Team, Strategy, Management, Matthew Brown

There’s nothing like an economic downturn to catalyze change in information management (IM) strategies. Someone asked me recently, “Isn’t information management a discretionary spend that will likely get cut in an economic downturn?” He cited the fact that tools for collaboration, content, and business intelligence are traditionally difficult to financially justify, “even when times are good.” He’s right about the latter - most Forrester clients struggle with assigning specific financial value to information management investments. In fact, as Forrester learned recently, many companies can’t even tell you what it costs them to run their own email. Similarly, over 90% of information management professionals we surveyed recently reported having no way to measure whether employees are even using their corporate portals, let alone realizing financial benefits from them. Yet I doubt the measurement difficulties will necessarily lead to prolonged decreases in investment overall. Two reasons:

  • Most information management initiatives are understaffed today. Human labor is typically the largest direct cost associated with information management programs. Yet I’m frequently surprised at how few skilled people - including project managers, business analysts, information architects, corporate librarians, and others - are actually staffed full time to programs. Instead, portal, search, business intelligence and content management initiatives often include a small core team in IT that struggles to keep up with demand for new applications, electronic forms, workflows, reports, and tools.
  • Demand for productivity gains and process improvements will grow in a down economy. Several companies that have put off investments in workplace productivity tools are now coming to Forrester saying the economy is giving them a reason to invest. This is counterintuitive, but it shouldn’t be. After all, it makes sense that when times are good, we don’t pay much attention to how we could be working better and smarter. One IM professional at an automotive electronics provider in the US agrees, saying his team recently partnered with the business to identify over 250 productivity and cost reduction opportunities in their core businesses. The combined business technology team is now evaluating how shared workspaces and virtual meetings can drive productivity gains for workers.

So it strikes me that the downturn could take one of two paths for IM pros. Those who can’t communicate the connection between information management practices, workforce productivity, and business process will find it increasingly hard to fund new projects. Many will be asked by the business to build more transparency into their costs - across staff, software, and hardware expenditures - in order to justify their very existence. Conversely, those information management professionals who can articulate the value of information management tools and practices may just find themselves helping to pioneer substantial changes to how people work. After all, it’s a lot easier to bring about change when there’s a burning reason to do so, like this economy. Forrester is going to explore this divide in more depth through a series of interactive teleconferences next week. We’re interested in hearing from other IM professionals who are using the down economy as an opportunity to catalyze change in their information management strategies and elsewhere in IT.

November 19th, 2008

Enterprise mashups need complexity to create value

Posted by Gil Yehuda @ 2:38 pm

Categories: Web 2.0

Tags: LinkedIn, Mashup, Collaboration, Gil Yehuda

Those who drink the Web 2.0 Kool-Aid live in an idealistic world where we can mentally connect a great idea to a great implementation of that idea. We live on faith that the great implementation will come, since there are plenty of smart people out there who will eventually figure out how to make value out of technology building blocks. Sometimes our faith is tested when the killer app does not show up for a long time. But evidence can restore our faith.

When I first saw mashups, I thought they were pretty cool. The canonical examples of this technology were all about the placement of data points onto a map. With mashups you can visualize where Fortune’s top 100 companies to work for are located, and you can find a mailbox nearby. It’s certainly nice to use once in a while, and maybe worth bookmarking. But will this alone transform a business? Likely not.

Sharing location information via an online travels social network, like Brightkite, FireEagle, Bluenity, Dopplr, or TripIt, is also pretty cool. Using the integration hooks into Twitter, Facebook or LinkedIn, I can begin to share location information with those I trust. I can post where I am located now, and where I plan to be in the near future. I believe there could be value in this activity, but it’s not yet transforming the way I use information into business results.

But I am now beginning to see how these services combine to build a useful application. The “aha” moment for me was when I stopped looking for one mashup, or one sharing app, and saw how companies can combine multiple streams to create value out of data. What if you take a map mashup, a calendar mashup, a travel micro-share feed, an events feed, and a dataset from a CRM system containing the names and locations of my customers? I trust that smart people can take these and create value. Why? Am I drinking the Kool-Aid? No – I see signals that indicate this is happening.

Look at the new LinkedIn application widgets that mash in LinkedIn data. My TripIt tool reminds me of a future trip to London, and tells me that I have a few LinkedIn contacts there. Based on information in my LinkedIn profile, LinkedIn tells me about relevant upcoming events. I found an interesting event in London. Will my contact attend that event? What if they’ll be traveling to Boston when I’m in London? D’oh! So close.

But I see the outline of a new pattern. Whereas each data mashup is interesting, the right combination can transform my work behavior. There’s a who, what, when, and where, that all have to intersect onto a map and onto a calendar.

I recently met Sanjay Vakil of LuckyCal, who understands this pattern well. He is connecting the dots to create transformative value out of data streams. His product has some growing up to do before it is ready for enterprise rollout. But his product today combines a mapping mashup (telling me which of my contacts are where I am going to be), with a calendar mashup (matching when contacts of mine will be near me), and with an events stream (telling me what other events are taking place there at that time). Hey, that’s the pattern: a set of data streams intersecting to create valuable information out of available information – but onto multiple mashup surfaces.

Suddenly the neurons start to fire. We need more than a single stream of data pins on a map to get our attention. As the LuckyCal product matures, it can become the paradigm for an enterprise mashup - triangulation. If it adds the data streams that matter most to me (e.g., my CRM data), along with other streams and network information, it results in new information. The triangulation of these data sets means that I could predict whom I will meet and what to do when I plan my trips. Moreover, my manager would be able to see where the team’s travelers are now, and where they will be in the near future. My sales manager could see which of us will be traveling near other clients, and she may want to take advantage of the proximity opportunities. Travel still happens, but we can get more value out of each trip. Enterprises like to hear that.

So, if you combine two mashups and a couple of data feeds, you can create transformative value from readily available information. I had faith this could be created; now that I see signals that others are implementing solutions like this, I have renewed faith in the relevance of mashups to enterprise computing. It’s just more complex than splashing a data set onto a map. That’s OK; enterprises are used to leveraging complexity to create value. And mashups can be the building blocks that enable their success.

November 17th, 2008

Is the ‘Green’ in Green IT dead?

Posted by Doug Washburn @ 1:28 pm

Categories: Green IT

Tags: Financial, Information Technology, Green IT, Doug Washburn

In a number of recent client interactions with both enterprise IT end users and vendors, the question of “Is the ‘green’ in Green IT dead?” has come up. Primarily driven by the current economic climate, IT end users want to understand how relevant the environmental benefits of Green IT should be to their strategic planning; likewise, vendors want to know how palatable the green messaging of their products and services is to their customers.

First and foremost, technology is not green and never will be. The design, manufacture, operation, and disposal of IT equipment generate tremendous upfront and ongoing environmental impact (read more about this in my “Is Green IT Your Emperor With No Clothes?” research). A recent – and very primetime – example of this is the 60 Minutes “The Electronic Wasteland” segment. David Berlind from InformationWeek offers a great follow-on to this in his “An E-Waste Story That’ll Make You Want To Quit Tech” story.

Secondly, the ecological benefits of Green IT take a backseat to the business benefits – namely cost reduction. In other words, IT leadership’s driving motivation for Green IT is financial, not environmental. This shouldn’t be a surprise. At the end of the day, corporations – even those with the greenest of intentions – make decisions to effectively manage risk, costs and revenues to deliver profits which ultimately drive shareholder value.

While corporate social responsibility and environmental sustainability are on the rise, these practices are being employed to ultimately achieve an economic goal. And a green strategy can be an effective means to this financial end. The Economist Intelligence Unit’s “Doing Good: Business And The Sustainability Challenge” identifies a positive correlation between green efforts and financial performance: “companies that rated their [green] efforts most highly over this time period [the past three years] saw annual profit increases of 16% and share price growth of 45%, whereas those that ranked themselves worst reported growth of 7% and 12% respectively.”

The key takeaway is that Green IT is no different. Because corporate IT operates within the realm of the corporation, financial obligations come first. While Forrester’s own research from April 2008 shows that “doing the right thing for the environment” is a top driver for IT professionals pursuing Green IT, these motivations must also deliver tangible business value – from reducing IT’s energy-related operating expenses to mitigating data center out-of-space or out-of-power concerns. So when setting Green IT strategy – especially in volatile economic times – I suggest IT leadership take a similar approach to Google’s Commitment to Sustainable Computing which explains: “Sustainability is good for the environment, but it makes good business sense too… It is this economic advantage that makes our efforts truly sustainable.”

November 17th, 2008

Addressing virtualization’s Achilles’ heel

Posted by James Staten @ 7:30 am

Categories: Data center, Virtualization

Tags: Bandwidth, Virtualization, Network Interface Card, Blade, Achilles Heel, VM, Networking, Servers, Hardware, James Staten

The benefits of virtualization are quite obvious, but when you start to really increase the density of virtual machines in order to maximize utilization, suddenly it ain’t such a simple proposition. The latest CPUs from AMD and Intel are more than up to the task of running 10-20 or more applications at a time. Most servers run out of memory and I/O bandwidth well before processing power. The leading server vendors have made recent announcements to address the memory side by packing more DIMMs onto a single motherboard (including blade server boards), but you can only add so many Ethernet cards and Fibre Channel HBAs. Oh yeah, and then there’s the switch ports to go with them (blade systems help a lot here).

If you are part of the elite group of infrastructure and operations managers who are pushing the VM density envelope, then 10GbE may be your better option. Most VMs individually don’t consume the full bandwidth of a single GbE NIC, but we are quickly seeing that the standard network configuration of an ESX host is 6 NICs and 2 FC ports. The NICs are for console, VM kernel, and VM network, and you need two of each, for redundancy, for a total of 6. And each of these NIC connections requires a separate data center uplink cable. On top of this, the more VMs you add, the more bandwidth is consumed, which requires more ports, and that means a lot of connections. Even if each connection only consumes 10% of that 1 GbE of bandwidth, you’re running out of I/O very quickly. Plus, every VM is sharing a limited set of physical NICs - heaven forbid you might actually want to do quality of service or give any of these VMs their own physical NIC, as is often the case.
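The cabling and bandwidth math above can be sketched in a few lines (all figures are assumptions taken from the discussion, not measurements):

```python
# Back-of-the-envelope math for one ESX host's network footprint.
# All figures are assumptions from the discussion above.

NIC_ROLES = ("console", "vmkernel", "vm_network")
REDUNDANCY = 2                 # two NICs per role, for failover
FC_PORTS = 2                   # redundant Fibre Channel HBA ports

nics_per_host = len(NIC_ROLES) * REDUNDANCY      # 6 NICs
uplinks_per_host = nics_per_host + FC_PORTS      # 8 cables per host

# If each VM averages ~10% of a 1 GbE link, the redundant pair of
# vm-network NICs (2 Gbps combined) saturates at roughly 20 VMs.
per_vm_gbps = 0.1
vm_network_capacity_gbps = REDUNDANCY * 1.0
max_vms_before_saturation = vm_network_capacity_gbps / per_vm_gbps

print(nics_per_host, uplinks_per_host, round(max_vms_before_saturation))
```

Multiply those 8 cables by a rack of hosts and the uplink sprawl becomes obvious, which is the density argument for 10GbE.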

10GbE can address the NIC sharing scenario, and Ethernet storage options such as iSCSI and the forthcoming Fibre Channel over Ethernet (FCoE) – yes, I know Cisco says it’s ready today – can save you tremendously on HBA costs. The need for more true physical connections, however, remains an issue.

The NIC vendors are addressing this scenario with SR-IOV (single-root I/O virtualization) technology, which splits 10GbE NICs granularly and dynamically so you can set quality-of-service parameters for the virtual NICs that share these pipes. But it’s a virtual solution; if you still need physical NICs, you’re out of luck.

To address this, HP has released Flex-10 Virtual Connect modules for its c-Class blade systems. These 10GbE switch modules (the technology is implemented on the 10GbE NICs in the BL495c blade, too) can physically split a single 10GbE connection into 4 physically discrete connections with tunable bandwidth (100Mbps increments, up to 10Gbps per connection). With Flex-10 modules and BL495c blades, each physical server gets 8 “physical” NICs (up to 24 with expansion cards), which fan out to 384 “physical” connections coming out of a full bank of switch modules. You can of course blow out this number with virtual NICs per VM, as not every VM will need its own physical NICs. And each of these connections can replace a FC port in an Ethernet storage configuration. If you want to pack a ton of VMs into a tiny package without sacrificing I/O performance, this is an intriguing way to go. Even if you don’t use Flex-10 for storage, the density benefits here are worth considering.
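The fan-out arithmetic works out as follows; the per-enclosure blade count and expansion-port count are assumptions chosen to match the figures quoted above, not HP-published configuration limits:

```python
# Flex-10 fan-out arithmetic per the figures above. The blade count and
# expansion-port count are assumptions, not official HP configuration data.

SPLITS_PER_10GBE_LINK = 4      # Flex-10 carves each 10GbE port into 4 discrete NICs
ONBOARD_PORTS = 2              # assumed: BL495c onboard 10GbE ports
PORTS_WITH_EXPANSION = 6       # assumed: onboard plus expansion-card ports
BLADES_PER_ENCLOSURE = 16      # assumed: half-height blades in a full enclosure

base_nics = SPLITS_PER_10GBE_LINK * ONBOARD_PORTS            # per server
max_nics = SPLITS_PER_10GBE_LINK * PORTS_WITH_EXPANSION      # per server, expanded
enclosure_connections = max_nics * BLADES_PER_ENCLOSURE      # full enclosure

print(base_nics, max_nics, enclosure_connections)  # 8 24 384
```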

As we stated in our report on 10GbE futures earlier this year, the move to 10GbE is a pricey upgrade today, but it is more easily justified as part of an IT infrastructure consolidation, since so much more consolidation can be achieved. Blade servers and even VMware constantly face similar price-justification challenges but are winning more and more customers through this same cost analysis. You’ll have to include the switch upgrades in your analysis, but if you can achieve 2x or greater consolidation in doing so, the investment may be well worth it.
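The shape of that cost analysis can be sketched in a few lines. All the dollar figures here are placeholders, not vendor quotes:

```python
# Toy model of the "2x or greater consolidation" justification above.
# Every price is an assumed placeholder, not a real quote.

def upgrade_pays_off(hosts_before, consolidation_factor,
                     cost_per_host, upgrade_cost_per_host):
    """True if the hardware retired by consolidating outweighs the
    10GbE NIC/switch upgrade cost on the hosts that remain."""
    hosts_after = hosts_before / consolidation_factor
    savings = (hosts_before - hosts_after) * cost_per_host
    upgrade_cost = hosts_after * upgrade_cost_per_host
    return savings > upgrade_cost

# assumed: 40 hosts, 2x consolidation, $6,000/host retired, $3,000 upgrade/host
print(upgrade_pays_off(40, 2, 6000, 3000))    # True
# assumed: the same fleet at only 1.2x consolidation
print(upgrade_pays_off(40, 1.2, 6000, 3000))  # False
```

The toy model makes the report’s point concrete: at 2x consolidation the retired hardware easily covers the switch and NIC upgrades, while marginal consolidation does not.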

November 3rd, 2008

Can Force.com help consolidate your web tier?

Posted by James Staten @ 12:34 pm

Categories: Data center, Cloud computing

Tags: Salesforce.com Inc., Web, Integration, Force.com, Sales Force Management, Channel Management, Sales, Marketing, James Staten

At Dreamforce today, here in San Francisco, Salesforce.com announced a significant, and seemingly long overdue, enhancement to its SaaS offering: Force.com for Facebook and Force.com for Amazon Web Services, pre-built integrations between its platform and these two others. The new capability lets enterprise customers of its CRM solution (or any other AppExchange or Force.com application) provide a public front end to their instance of these services, served directly from those platforms. The big deal with these additions is that they let you tie third-party applications directly into your Force.com applications. In the case of the AWS integration, if you have applications or services built in Java, on the LAMP stack, or in native C code, you can integrate them with your Force.com apps.

There is no Salesforce.com log-in required, and there is no separate web-tier integration you have to host and manage. For example, if a manufacturer used Salesforce.com to maintain its price list and order entry, it could simply build a public storefront in VisualForce, and that storefront would be served up directly from Force.com. Previously, if you wanted to leverage Salesforce.com data on your web site, you had to code integrations between the Salesforce.com application in question and your web apps – and, of course, carry the expense of hosting, sizing, and optimizing that infrastructure for performance.
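To make the “before” pattern concrete, here is a minimal sketch of the kind of self-hosted integration code this announcement eliminates. The endpoint URL and JSON field names are hypothetical placeholders, not Salesforce.com’s actual API:

```python
# Hypothetical "before" picture: a self-hosted web tier that pulls price-list
# data from a CRM web-service endpoint and renders a storefront page.
# The URL and field names are placeholders, not Salesforce.com's real API.

import json
from urllib.request import urlopen

CRM_PRICELIST_URL = "https://crm.example.com/api/pricelist"  # placeholder

def fetch_price_list(url=CRM_PRICELIST_URL):
    # Integration code you must host, size, and keep working yourself.
    with urlopen(url) as resp:
        return json.load(resp)

def render_storefront(items):
    # The public-facing piece Force.com can now serve for you directly.
    return "\n".join(f"{item['sku']}: ${item['price']}" for item in items)
```

With the new Force.com integrations, both halves of this – the fetch and the hosting of the rendered front end – move off your infrastructure.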

We certainly don’t see enterprises migrating their entire web presence to Force.com because of this, but directly providing this capability from the cloud will let enterprises eliminate a portion of their web infrastructure and free their developers from having to maintain and troubleshoot the integration code that made this possible before.

In today’s recessionary climate, anything that helps streamline IT operations and eliminates costs is a good thing.

There are caveats to this, of course. You have to build these web front-ends in Apex or VisualForce, Salesforce.com’s proprietary language and UI builder, respectively, and clients will have to understand the security, process and workflow implications of this change. But if you already have skills in their tools today – which is probably the case if you are part of their installed base – then the transition should pay dividends.

November 1st, 2008

Better late than never: Microsoft Office Web apps percolating

Posted by Sheri McLeish @ 10:57 am

Categories: Uncategorized, Information Workplace

Tags: Desktop, Web, Knowledge Worker, Web Application, Microsoft Corp., Zoho, Channel Management, Microsoft Office, Marketing, Office Suites

When Microsoft announced this week that its next version of Office will include web apps, there was no real surprise. But it reminded me of Steven Wright on Dr. Katz, when he acknowledged that he usually had four or five cups of coffee before his first cup of coffee. Knowledge workers have started drinking at the web apps cafe, but are just getting warmed up for the real thing. It’s when Microsoft’s brew is ready that it starts to count.

Microsoft has been sloth-like in moving its apps to the web, but it is coming at this from a position of strength. Zoho has great buzz and seems to add an app a week, while for Google it’s an all-or-nothing bid focused on alternatives to desktop productivity from Office. Microsoft is taking on wikis and other collaborative needs, blending the experience into the tools. Want collaborative authoring? They have it (by the way, so will Adobe soon). Want a web-based editor? Got it. Want the web-based editor to work with peers who have the full desktop suite while you collaborate on a contract or meeting notes? Got it. But bringing it all together without being overly complex or undercutting Microsoft’s Office suite margins takes time.

Microsoft is trying to evolve to keep its dominance in desktop productivity and maintain relevance, and today it continues to reign supreme in the desktop productivity tools space. The question is whether knowledge workers will be satisfied by lighter-weight versions of their desktop tools, as Microsoft will be offering only scaled-back web versions of its Office programs. Forrester’s data shows the uptake of web productivity apps in the enterprise to be minuscule, even where interest is expressed. That’s because no one is really satisfied with lighter-weight versions; if they were a viable alternative to Office, we’d have seen much greater adoption of Google and others. And enterprises believing they will be able to reduce licensing costs with web-based Office apps will likely be disappointed, as availability will be through existing volume licensing agreements. Deployment costs should go down, however, and those are material.

So far, time seems to be on Microsoft’s side. Even though web-based productivity tools exist, no one has done much more yet than mimic Microsoft’s capabilities. And no one has successfully figured out an efficient content collaboration strategy that engages knowledge workers the way they want to work – at the office, on the go, on the web, offline, authenticated or not, or some combination of these depending on a person’s role, location, and personal preferences – or what day of the week it is. Choice will reign supreme as the knowledge-worker demographic shifts and expectations rise for seamless transitions between devices and desktops, between wikis and Word. Microsoft recognizes this shift and can leverage its familiar apps to address the gap. But for now, it’s still hurry up and wait.
