Category Archives: Telecommunications
Most application owners know that as data volumes accumulate, application performance can take a major hit if the underlying infrastructure is not aligned to keep up with demand. The problem is that constantly adding hardware to manage data growth can get costly – stealing budgets away from needed innovation and modernization initiatives.
Join Julie Lockner as she reviews the Cox Communications case study on how the company solved an application performance problem caused by too much data – using the hardware it already had – with Informatica Data Archive with Smart Partitioning. Source: TechValidate. TVID: 3A9-97F-577
One major outcome of the Big Data debate, especially in the telco industry, is the sudden elevation of the focus on Customer Experience, in the form of Quality of Experience (QoE) and Customer Experience Management (CEM). With new and emerging technologies such as Near Field Communication (NFC), Machine to Machine (M2M) and mobile social media apps hitting the news every day like a reality ‘star’s’ socializing antics, we are all fascinated by how much organisations either know, can find out or can deduce about our lives: what we like and dislike, how much we may be worth to those organisations, what we already own, and even where we physically are or will be in the next few minutes. All the minutiae of our lives and personalities are laid bare to be pawed over, analysed and used to control us and eventually sell us yet more ‘stuff’.
Remote Data Collection and Transformation – with Ultra Messaging Cache Option and B2B Data Transformation
Sometimes when I drive past an electronic tollway collection sensor, I wonder about the amount of data it must generate. I’m no expert on such technology, but at a minimum the RFID sensor has to read the chip in your car and log the date, time and your RFID info, and then a camera takes a picture to catch any potential violators. Now multiply that by the hundreds of thousands of cars that drive such roads every day, times the number of sensors they pass, and the total easily exceeds several million messages per day.
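As a rough back-of-the-envelope calculation of that scale – every figure below is an illustrative assumption, not data from any real tollway:

```python
# Back-of-the-envelope estimate of daily tollway message volume.
# All figures here are illustrative assumptions, not real traffic data.

cars_per_day = 300_000      # assumed daily traffic on the tollway
sensors_per_trip = 4        # assumed number of gantries each car passes
events_per_read = 2         # assumed: one RFID read + one camera capture

daily_messages = cars_per_day * sensors_per_trip * events_per_read
print(f"~{daily_messages:,} messages per day")   # ~2,400,000 messages per day
```

Even with conservative assumptions, the volume lands in the millions of messages per day – and that is before anyone tries to move, transform or analyse the data downstream.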
Gartner hosted a webinar on January 10, 2012: Gartner Worldwide IT Spending Forecast. One of the topics covered was industry IT spend for 2012.
In covering that topic, they made a point of saying that due to severe flooding in Thailand, they expect storage to be in short supply (as much as a 29% global shortfall) through the end of 2012. The price of storage per GB is expected to increase as a result, with supplies falling short of demand. They recommended finding alternatives to purchasing storage to keep costs down.
If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012. For two reasons – one, to truly understand the space, and two, to understand the critical capabilities required of a solution that solves real data integration problems.
At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration: it enables fast, direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset that must complement existing data architectures to support BI agility, MDM and SOA.
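To make the distinction concrete, here is a deliberately simplified sketch of the federation idea that Data Virtualization generalizes – one virtual view over separate sources, queried in place rather than copied into a warehouse. All table names and data are invented for illustration; real Data Virtualization layers add far more (caching, optimization, governance) than this toy shows:

```python
# Toy illustration of federation: a single "virtual" view over two
# stand-in source systems. Names and data are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")

# Stand-ins for two physically separate source systems.
conn.executescript("""
    CREATE TABLE crm_customers (customer_id INTEGER, name TEXT);
    CREATE TABLE billing_accounts (customer_id INTEGER, balance REAL);
    INSERT INTO crm_customers VALUES (1, 'Acme Corp'), (2, 'Globex');
    INSERT INTO billing_accounts VALUES (1, 1200.50), (2, 87.25);
""")

# The "virtual" layer: consumers query one view; no data is moved.
conn.execute("""
    CREATE VIEW customer_360 AS
    SELECT c.customer_id, c.name, b.balance
    FROM crm_customers AS c
    JOIN billing_accounts AS b ON b.customer_id = c.customer_id
""")

for row in conn.execute("SELECT * FROM customer_360"):
    print(row)   # (1, 'Acme Corp', 1200.5), (2, 'Globex', 87.25)
```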
Similar to the way a carburetor restrictor plate limits the top speed of NASCAR race cars by restricting maximum airflow, inefficient messaging middleware prevents IT organizations from processing vital business data as fast as possible.
Most industry analysts agree that there will be a dramatic increase in the number and nature of ‘connected devices’ over the next three to five years. These are predicted to run into the tens of billions and include communication-enabled consumer devices, machines that move around (vehicles), things that don’t (vending machines, street furniture) and a myriad of sensors, cameras and other intelligent devices.
And where there are things measuring stuff, there will be data – enormous amounts of data – and people and systems wanting to integrate and analyse it.
In a recent InformationWeek blog, “Big Data A Big Backup Challenge”, George Crump aptly pointed out the problems of backing up big data and outlined some best practices that should be applied to address them, including:
- Identifying which data can be re-derived and therefore doesn’t need to be backed up
- Eliminating redundancy through file de-duplication and applying data compression (see the sketch after this list)
- Using storage tiering and a combination of online disk and tape to reduce storage cost and optimize performance
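As a minimal sketch of the de-duplication idea from the list above – grouping files by content hash so only one copy of each unique file needs to be backed up. The staging path is hypothetical, and this is an illustration of the concept, not the specific approach Crump describes:

```python
# Minimal file-level de-duplication sketch: map content hashes to the
# files that share them, so a backup only needs one copy of each.

import hashlib
from pathlib import Path

def content_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root: Path) -> dict[str, list[Path]]:
    """Group every file under root by its content hash."""
    by_hash: dict[str, list[Path]] = {}
    for path in root.rglob("*"):
        if path.is_file():
            by_hash.setdefault(content_hash(path), []).append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    # "/data/backup-staging" is a hypothetical staging directory.
    for h, paths in find_duplicates(Path("/data/backup-staging")).items():
        print(f"{h[:12]}... has {len(paths)} copies; back up only one")
```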
In the past, the term latency was largely ignored in the IT world, with the exception of network engineers and algorithmic trading experts. But today there is compelling evidence that latency is an important metric for every business that runs a website or deploys Rich Internet Applications (RIAs), because even small delays in presenting data show a clear pattern of pushing customers and readers away.
Interesting data, replicated by multiple sources (including Bing, Google, and Amazon), show that slow-loading pages can cause the viewer to lose focus and potentially click on something else, possibly never to return.
For instance, on search results, a delay of just 0.5 seconds chases away up to 20% of the traffic and revenue. As this O’Reilly Radar post puts it, “delays under half a second impact business metrics”.
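If you want to see where your own site stands, here is a minimal sketch of measuring response latency from a client’s point of view. The URL is a placeholder, and this ignores browser rendering time, which also matters for RIAs:

```python
# Rough client-side latency measurement: approximate time-to-first-byte
# (urlopen returns once the status line and headers arrive) and total
# time to download the full response body.

import time
import urllib.request

def measure_latency(url: str) -> tuple[float, float]:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        ttfb = time.perf_counter() - start   # headers received
        resp.read()                          # drain the body
    total = time.perf_counter() - start
    return ttfb, total

# "https://example.com/" is a placeholder URL.
ttfb, total = measure_latency("https://example.com/")
print(f"TTFB: {ttfb * 1000:.0f} ms, full load: {total * 1000:.0f} ms")
```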
The phrase ‘Data Tsunami’ has been used by numerous authors in the last few months, and it’s difficult to find a more suitable analogy, because what’s approaching is of such an increased order of magnitude that the IT industry’s current expectations for data growth will be swamped in the next few years.
However impressive a spectacle a tsunami is, it still wreaks havoc on those who are unprepared or who believe they can tread water and simply float to the surface when the trouble has passed.