Category Archives: Public Sector
What is a 360° View of the Citizen? Not a new question, but perhaps one that has not been completely understood by government. Is the 360° View just the latest buzzword or pipedream, or a real solution that can drive new levels of customer service while also addressing some of the greatest challenges facing governments today, including the pressing need to reduce costs, improve service delivery, decrease error rates, and drive positive outcomes? Given the siloed nature of government, a 360° View may seem elusive at best, or completely unrealistic, to some. But some forward-thinking governments are already well down the path of achieving a 360° View of the Citizen, and they are using the power of this approach to improve customer service, meet the increasing demand for transparency, reduce improper payments, waste, fraud, and abuse, improve program outcomes, and drive positive policy changes. (more…)
Government organizations continue to face increasing pressure to improve customer service and operational efficiency. To date, almost every organization has embarked on some type of program ranging from 311 call centers to CRM projects in an effort to be more responsive. However, these initiatives may be only scratching the surface of what is needed to achieve real improvements across government. (more…)
I. The Problem
At a time when governments at all levels are being called upon to once again "do more with less," Data Center Consolidation (DCC) has become a hot topic. While the Federal government has called for reducing its roughly 2,400 Data Centers to about 1,200 by 2015, what does that mean, and most importantly, how does government get there? (more…)
As the federal government reported an estimated $115 billion in improper payments in Fiscal Year 2011, the impetus to eliminate and recover these funds continues to mount. State governments also struggle with mounting and often embarrassing improper payments, with estimated totals approaching $125 billion. (more…)
Remote Data Collection and Transformation – with Ultra Messaging Cache Option and B2B Data Transformation
Sometimes when I drive past an electronic tollway collection sensor, I wonder about the amount of data it must generate. I’m no expert on such technology, but at a minimum, the RFID sensor has to read the chip in your car, and log the date and time plus your RFID info, and then a camera takes a picture to catch any potential violators. Now multiply that data times the hundreds of thousands of cars that drive such roads every day, times the number of sensors they pass, and I’m quite sure this number exceeds several million messages per day. (more…)
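The multiplication above can be made concrete with a quick back-of-envelope calculation. All of the figures below are illustrative assumptions, not measured tollway data:

```python
# Back-of-envelope estimate of daily toll-sensor message volume.
# Every number here is an assumption for illustration only.
cars_per_day = 300_000       # assumed daily traffic on a busy tollway
sensors_per_trip = 10        # assumed gantries each car passes
messages_per_read = 2        # assumed: one RFID log entry plus one camera event

daily_messages = cars_per_day * sensors_per_trip * messages_per_read
print(f"{daily_messages:,} messages/day")  # 6,000,000 messages/day
```

Even with conservative inputs, the total lands comfortably in the millions of messages per day, which is the point of the paragraph above.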
Gartner hosted a webinar on January 10, 2012: Gartner Worldwide IT Spending Forecast. One of the topics covered was industry IT spend for 2012.
In covering that topic, they made the point that due to severe flooding in Thailand, they expect storage to be in short supply (as much as a 29% global shortfall) through the end of 2012, with the price of storage per GB rising as supplies fall short of demand. They recommended finding alternatives to purchasing storage to keep costs down. (more…)
If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012. For two reasons – one, to truly understand the space, and two, to understand the critical capabilities required of a solution that solves real data integration problems.
At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration. It enables fast and direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset which must complement existing data architectures to support BI agility, MDM and SOA. (more…)
In a recent InformationWeek blog, “Big Data A Big Backup Challenge”, George Crump aptly pointed out the problems of backing up big data and outlined some best practices that should be applied to address them, including:
- Identifying which data can be re-derived and therefore doesn’t need to be backed up
- Eliminating redundancy through file de-duplication and data compression
- Using storage tiering and the combination of online disk and tapes to reduce storage cost and optimize performance (more…)
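The de-duplication and compression practices in the list above can be sketched in a few lines. This is a minimal illustration using content hashing, not a description of any particular backup product; the file names and contents are made up:

```python
import hashlib
import zlib

def deduplicate_and_compress(files):
    """Store one compressed copy per unique content hash.

    `files` maps a file name to its raw bytes. Identical contents are
    detected by SHA-256 digest and stored only once (de-duplication),
    and each stored blob is zlib-compressed (compression).
    """
    store = {}   # content digest -> compressed bytes
    index = {}   # file name -> content digest
    for name, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        if digest not in store:                   # de-duplication step
            store[digest] = zlib.compress(data)   # compression step
        index[name] = digest
    return store, index

# Hypothetical example: two of the three files are exact duplicates.
files = {
    "report_v1.txt": b"quarterly numbers " * 100,
    "report_copy.txt": b"quarterly numbers " * 100,
    "notes.txt": b"misc notes",
}
store, index = deduplicate_and_compress(files)
print(len(files), "files ->", len(store), "stored blobs")
```

Real backup systems de-duplicate at the block level rather than whole files, but the principle is the same: hash first, store only what is new.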
In the past, latency was largely ignored in the IT world, with the exception of network engineers and algorithmic trading experts. But today, there is compelling evidence that latency is an important metric for every business that runs a website or deploys Rich Internet Applications (RIAs), because even small delays in presenting data show a clear pattern of pushing customers and readers away.
Interesting data, replicated by multiple sources (including Bing, Google, and Amazon), show that slow-loading pages can cause the viewer to lose focus and potentially even click on something else, possibly never to return.
For instance, on search results, a delay of just 0.5 seconds chases away up to 20% of the traffic and revenue. As this O’Reilly Radar post puts it, “delays under half a second impact business metrics”.
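To see what that 20% figure means in practice, here is a rough revenue-impact calculation. The delay and loss rate come from the figures cited above; the baseline revenue number is an invented example:

```python
# Rough revenue-impact arithmetic from the cited 0.5 s / 20% figure.
# The baseline revenue is a hypothetical example value.
baseline_daily_revenue = 100_000.0   # assumed daily revenue in dollars
traffic_loss_rate = 0.20             # up to 20% loss at a 0.5 s delay

estimated_daily_loss = baseline_daily_revenue * traffic_loss_rate
print(f"${estimated_daily_loss:,.0f} potentially lost per day")
```

For a business of that size, half a second of latency could translate into tens of thousands of dollars per day, which is why latency now belongs on the business dashboard, not just the network engineer's.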
Friday August 5, 2011 set new records for trading volume around the world. According to this FT.com story: “The amount of data generated by the day’s trading in US futures and equities alone saw over 130m trades on Friday, generating 950 gigabytes of data, according to Nanex, a market data provider.” In London, “some exchanges with older technology could not cope”. And so Big Data strikes again.
But market data volume has been exploding for months, even years. This is just one more chapter in a long story, illustrating the types of problems that a business could encounter if they neglect their technical infrastructure in the face of data volume growth. (more…)
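The FT figures quoted above also imply an average per-trade data footprint, which is easy to derive. This treats "gigabyte" as 10^9 bytes and uses only the two numbers from the story:

```python
# Average data footprint per trade, from the figures quoted above:
# roughly 130 million trades generating 950 gigabytes of data.
trades = 130_000_000
total_bytes = 950 * 10**9        # treating "gigabyte" as 10^9 bytes

bytes_per_trade = total_bytes / trades
print(f"~{bytes_per_trade:,.0f} bytes per trade")  # roughly 7,300 bytes
```

A few kilobytes per trade sounds small until it is multiplied across a record-setting day, which is exactly how infrastructure that "could not cope" gets exposed.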