The term “big data” has been bandied around so much in recent months that, arguably, it has lost much of its meaning in the IT industry. IT teams have typically heard the phrase and know they need to be doing something, but that something isn’t getting done. As IDC pointed out last year, there is a concerning shortage of trained big data technology experts, and failing to recognise the business implications of unmanaged big data is dangerous.

In today’s information economy, as increasingly digital consumers, customers, employees and social networkers, we hand over more and more personal information for businesses and third parties to collate, manage and analyse. On top of this growth in digital data, emerging trends such as cloud computing are having a huge impact on the amount of information businesses must handle and store on behalf of their customers.

Furthermore, it’s not just the volume of information that’s spiralling out of control: it’s also the way in which it is structured and used. The dramatic rise in unstructured data, such as photos, videos and social media, presents businesses with new challenges in collating, handling and analysing it. As a result, information is growing exponentially: experts now predict a staggering 4,300% increase in annual data generation by 2020. Unless businesses put policies in place to manage this wealth of information, it will become worthless, and the often extortionate cost of storing it will instead take a heavy toll on the business’s bottom line.

Maxed out data centres

Many businesses have limited resources to invest in physical servers and storage, and so are increasingly looking to data centres to store their information. As a result, data centres across Europe are quickly filling up.
Due to European data retention regulations, which dictate that information is generally stored for longer periods than in other regions such as the US, businesses across Europe must wait a long time before they can archive their data. For instance, under EU law, telecommunications service and network providers are obliged to retain certain categories of data for a specific period (typically between six months and two years) and to make that information available to law enforcement where needed. With this in mind, it’s no surprise that investment in high-performance storage capacity has become a key priority for many.

Time for a clear out

So how can organisations deal with these storage issues? They can upgrade or replace their servers, parting with a great deal of capital expenditure to bring in more power or more memory for central processing units (CPUs). An alternative is to “spring clean” their information. Smart partitioning lets businesses spend around one tenth of what new servers and storage capacity would cost, and instead refocus how their information is organised. With smart partitioning capabilities, businesses can gain the benefits of archiving even for information that is not yet eligible for archiving under EU retention regulations. Furthermore, application retirement frees up floor space, drives modernisation initiatives, allows mainframe systems and older platforms to be replaced, and lets legacy data be migrated to virtual archives.

Before IT professionals go out and buy big data systems, they need to spring clean their information and make room for big data. Poor economic conditions across Europe have stifled innovation at many organisations, forcing them to focus on staying alive rather than investing in R&D to improve operational efficiency. They are therefore looking for ways to squeeze more out of already shrinking budgets.
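To make the partitioning idea concrete, the core decision is simply splitting records by whether they still fall inside the retention window. The sketch below is a minimal, hypothetical illustration (the record layout, field names and the roughly two-year cut-off are assumptions, not a vendor feature):

```python
from datetime import datetime, timedelta

# Hypothetical records: each carries a creation timestamp.
records = [
    {"id": 1, "created": datetime(2010, 3, 1)},
    {"id": 2, "created": datetime(2012, 1, 15)},
]

def partition_by_retention(records, now, retention=timedelta(days=730)):
    """Split records into those still inside the retention window
    (kept on primary storage) and those eligible for archiving."""
    keep, archive = [], []
    for rec in records:
        (keep if now - rec["created"] < retention else archive).append(rec)
    return keep, archive

keep, archive = partition_by_retention(records, now=datetime(2012, 6, 1))
```

In practice the same rule would be expressed as a partitioning policy inside the storage or archiving platform rather than in application code, but the logic is the same: age out what the regulations no longer require you to keep online.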
The likes of smart partitioning and application retirement offer businesses a real solution to the growing big data conundrum. So maybe it’s time you got your feather duster out, and gave your information a good clean out this spring?
Gartner hosted a webinar on January 10, 2012: Gartner Worldwide IT Spending Forecast. One of the topics covered was industry IT spend for 2012.
In covering that topic they made a point of saying that, due to severe flooding in Thailand, they expect storage to be in short supply (a global shortfall of as much as 29%) through the end of 2012. The price of storage per gigabyte is expected to increase as a result, with supply falling short of demand. They recommended finding alternatives to purchasing storage to keep costs down.
Richard Cramer, Chief Healthcare Strategist at Informatica talks about protecting healthcare data in non-production testing environments.
Informatica has been involved in many high value application consolidation projects and naturally our perspective is all about the data – we are interested in its quality, its lineage, its size and shape and how it correlates from one application to another. We provide a range of services to help our customers with this. This is all very important but we also need to think about the larger picture: the application servers, the database servers, shared infrastructure and storage.
Automated [IT infrastructure] discovery is a kind of holy grail: the idea is that you detect all of the servers and inter-server communications happening within a data centre, and somehow auto-magically infer business applications and services and all their dependencies. As is so often the case, the reality is somewhat more complex. What you actually need to do is then apply lots of filters and human refinements to remove a vast amount of noise, so that you end up with a useful and usable model.
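A rough sketch of that filtering step, assuming the discovery tool emits (source, destination, port) connection records: everything here, from the host names to the noise ports and the repeat-count threshold, is an illustrative assumption, not a real tool's output.

```python
from collections import Counter

# Hypothetical observed connections: (source host, destination host, port).
connections = [
    ("web01", "db01", 5432),
    ("web01", "db01", 5432),
    ("web01", "db01", 5432),
    ("web02", "db01", 5432),
    ("web01", "ntp01", 123),   # time sync: infrastructure noise
    ("mon01", "web01", 22),    # monitoring sweep over ssh
]

NOISE_PORTS = {22, 123, 161}   # ssh, ntp, snmp: rarely true app dependencies

def infer_dependencies(connections, min_count=2):
    """Keep only repeated, non-infrastructure edges as likely dependencies."""
    counts = Counter(edge for edge in connections if edge[2] not in NOISE_PORTS)
    return {edge for edge, n in counts.items() if n >= min_count}

deps = infer_dependencies(connections)
```

Even with filters like these, the remaining edges still need human review before they can be trusted as an application dependency map, which is exactly the refinement work described above.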
I spent last week at EMCWorld, and most of that time was spent engaging with customers in a variety of ways. One thing I always find interesting is the remarkable consistency of priorities across our global customers.
For example, we held a session for executives, and I simply asked the open-ended question: “Regardless of whether it involves our products or not, what is your top IT priority this year?” The answer was clear, overwhelming and simple, yet also rather surprising.
But before I tell you the answer, I want to tell you why it was surprising to me. I keep up with the CIO surveys and the trends and buzzwords. This particular trend seems to be invisible in the media hype, and yet this group of CIOs and senior execs were almost unanimous in agreeing that it was at the top of their priority list.
Over the next few months on the Perspectives blog, I would like to address the many facets of what makes a data migration project work, what makes it successful (and what defines success, for that matter). I am approaching this from the perspective of best practices, including processes, skills and, of course, tools. As a jumpstart, this post and the next five will cover ‘The Five Pitfalls of Data Migration’. These are five key areas that anyone starting a data migration, or in the midst of one, must consider.
First things first: what do I mean when I say data migration? Generally there are two types. Storage data migration addresses a server or database upgrade and requires little or no data transformation. Application data migration addresses an application replacement, upgrade or consolidation, and requires a significant amount of data transformation. You may be moving to a new version of your CRM application, or consolidating customer databases into a single view following an acquisition. Your organization may be embarking on a modernization program, moving from legacy bespoke applications to a new off-the-shelf solution. Whatever the driver, there are sound approaches to include when planning your strategy.
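The transformation work that distinguishes application data migration can be pictured as a small extract–transform–load step. The sketch below is a hypothetical example: the legacy field names (`CUST_NM`, `SIGNUP_DT`) and the target schema are assumptions made up for illustration, not any particular CRM's layout.

```python
# Hypothetical legacy CRM rows: names concatenated, dates as raw strings.
legacy_rows = [
    {"CUST_NM": "SMITH, JOHN", "SIGNUP_DT": "20110215"},
    {"CUST_NM": "DOE, JANE", "SIGNUP_DT": "20120101"},
]

def transform(row):
    """Map one legacy row onto the target application's schema."""
    last, first = [part.strip() for part in row["CUST_NM"].split(",")]
    d = row["SIGNUP_DT"]
    return {
        "first_name": first.title(),
        "last_name": last.title(),
        "signup_date": f"{d[:4]}-{d[4:6]}-{d[6:]}",  # reformat as ISO 8601
    }

migrated = [transform(row) for row in legacy_rows]
```

A storage data migration would skip this step entirely and move the bytes as-is; it is exactly this field-by-field mapping, multiplied across hundreds of fields and tables, that makes the application variety so much more demanding.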