Category Archives: Public Sector
I. The Problem
At a time when governments at all levels are once again being called upon to “do more with less,” Data Center Consolidation (DCC) has become a hot topic. The Federal government has called for reducing its roughly 2,400 Data Centers to about 1,200 by 2015. But what does that mean, and more importantly, how does government get there?
With the federal government reporting an estimated $115 billion in improper payments in Fiscal Year 2011, the impetus to eliminate and recover these funds continues to mount. State governments struggle with their own mounting, and often embarrassing, improper payments, with estimated totals approaching $125 billion.
Remote Data Collection and Transformation – with Ultra Messaging Cache Option and B2B Data Transformation
Sometimes when I drive past an electronic tollway collection sensor, I wonder about the amount of data it must generate. I’m no expert on such technology, but at a minimum, the RFID sensor has to read the chip in your car and log the date, the time, and your RFID info, and then a camera takes a picture to catch any potential violators. Now multiply that data by the hundreds of thousands of cars that drive such roads every day, and by the number of sensors they pass, and I’m quite sure the total exceeds several million messages per day.
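A rough back-of-envelope calculation makes the point. The traffic figures below are hypothetical illustrations (the post doesn’t cite exact numbers), but even modest assumptions land in the millions of messages per day:

```python
# Back-of-envelope estimate of daily toll-sensor message volume.
# All figures are hypothetical illustrations, not real traffic data.
cars_per_day = 300_000     # cars using the tollway each day (assumed)
sensors_per_trip = 4       # collection points a typical car passes (assumed)
messages_per_read = 2      # one RFID read event + one camera capture event

daily_messages = cars_per_day * sensors_per_trip * messages_per_read
print(f"{daily_messages:,} messages per day")  # 2,400,000 messages per day
```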
Gartner hosted a webinar on January 10, 2012: Gartner Worldwide IT Spending Forecast. One of the topics covered was industry IT spend for 2012.
In covering that topic, they made a point of saying that due to severe flooding in Thailand, they expect storage to be in short supply (as much as a 29% global shortfall) through the end of 2012. The price of storage per GB is expected to increase as a result, and supplies will fall short of demand. They recommended finding alternatives to purchasing storage to keep costs down.
If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012. For several reasons – one, to truly understand the space, and two, to understand the critical capabilities required to be a solution that solves real data integration problems.
At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration. It enables fast and direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset that must complement existing data architectures to support BI agility, MDM, and SOA.
In a recent InformationWeek blog, “Big Data A Big Backup Challenge”, George Crump aptly pointed out the problems of backing up big data and outlined some best practices that should be applied to address them, including:
- Identifying which data can be re-derived and therefore doesn’t need to be backed up
- Eliminating redundancy through file de-duplication and data compression
- Using storage tiering and a combination of online disk and tape to reduce storage cost and optimize performance
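To illustrate the de-duplication idea above, here is a minimal sketch (with a hypothetical in-memory file set; a real backup tool would hash files streamed from disk) that groups files by content hash so only one copy per group needs to be backed up:

```python
import hashlib

def dedupe(files: dict) -> dict:
    """Group file paths by content hash; one copy per group needs backup."""
    groups = {}
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        groups.setdefault(digest, []).append(path)
    return groups

# Hypothetical example: two of the three files share identical content.
files = {"a.log": b"hello", "b.log": b"hello", "c.log": b"world"}
unique = dedupe(files)
print(len(unique))  # 2 -> back up 2 copies instead of 3
```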
In the past, the term latency has been largely ignored in the IT world, with the exception of network engineers and algorithmic trading experts. But today, there is compelling evidence that latency is an important metric for every business that runs a website, or that deploys Rich Internet Applications (RIAs), because even small delays in presenting data show a clear pattern of pushing customers and readers away.
Interesting data, replicated by multiple sources (including Bing, Google, and Amazon), show that slow-loading pages can cause the viewer to lose focus and potentially even click on something else, possibly never to return.
For instance, on search results, a delay of just 0.5 seconds chases away up to 20% of the traffic and revenue. As this O’Reilly Radar post puts it, “delays under half a second impact business metrics”.
Friday August 5, 2011 set new records for trading volume around the world. According to this FT.com story: “The amount of data generated by the day’s trading in US futures and equities alone saw over 130m trades on Friday, generating 950 gigabytes of data, according to Nanex, a market data provider.” In London, “some exchanges with older technology could not cope”. And so Big Data strikes again.
But market data volume has been exploding for months, even years. This is just one more chapter in a long story, illustrating the kinds of problems a business can encounter if it neglects its technical infrastructure in the face of data volume growth.
As part of cost-cutting programs, organizations are consolidating data centers and the applications within them. Federal and state agencies in the public sector are among those where IT consolidation and moving applications to the cloud are top priorities, part of an overall goal of increasing efficiency and eliminating costs. In other industries, many consolidations are also under way due to mergers and acquisitions and other cost-cutting initiatives. As you plan or undertake a consolidation project, you also need to plan for the retirement of the legacy, redundant applications that are left behind.
Upon reflection after returning with the rest of the Ultra Messaging® team from the recently concluded SIFMA 2011 Financial Services Technology Expo, I found these memories flooding back…
- Grow your business, not your infrastructure – We shared the latest news about our ultra low latency messaging products and how they help a business grow, including helping with Big Data problems and Risk and Compliance.
- Latest and greatest – Along with EMC and Kaazing, we shared the latest about our synergistic technologies. Watch this Kaazing video to hear Mike Pickett, VP of Product Marketing for Ultra Messaging, discuss the synergies between Kaazing WebSockets and Ultra Messaging for scaling streaming message data out to mobile devices and other web applications.
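The scale-out pattern described above (one low-latency message stream fanned out to many web and mobile clients) can be sketched in plain Python. The `FanOutBroker` below is a hypothetical in-process stand-in for the messaging layer and WebSocket gateway, not Ultra Messaging’s or Kaazing’s actual API:

```python
class FanOutBroker:
    """Toy stand-in for a messaging fan-out layer: one inbound stream,
    many subscribers (e.g., WebSocket-connected mobile clients)."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, message):
        # Each inbound message is delivered once to every subscriber.
        for deliver in self.subscribers:
            deliver(message)

# Hypothetical usage: two "clients" receive the same market-data tick.
received = []
broker = FanOutBroker()
broker.subscribe(lambda m: received.append(("mobile", m)))
broker.subscribe(lambda m: received.append(("web", m)))
broker.publish({"symbol": "ABC", "px": 10.5})
print(len(received))  # 2
```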