Category Archives: B2B Data Exchange
The term “big data” has been bandied around so much in recent months that, arguably, it has lost a lot of its meaning in the IT industry. Typically, IT teams have heard the phrase and know they need to be doing something, but that something isn’t being done. As IDC pointed out last year, there is a concerning shortage of trained big data technology experts, and failing to recognise the business implications of unmanaged big data is dangerous.
In today’s information economy, as increasingly digital consumers, customers, employees and social networkers, we are handing over more and more personal information for businesses and third parties to collate, manage and analyse. On top of the growth in digital data, emerging trends such as cloud computing are having a huge impact on the amount of information businesses are required to handle and store on behalf of their customers. Furthermore, it’s not just the amount of information that’s spiralling out of control: it’s also the way in which it is structured and used. There has been a dramatic rise in unstructured data, such as photos, videos and social media, which presents businesses with new challenges in how to collate, handle and analyse it. As a result, information is growing exponentially: experts now predict a staggering 4,300% increase in annual data generation by 2020. Unless businesses put policies in place to manage this wealth of information, it will become worthless, and the often extortionate cost of storing it will instead take a huge toll on the business’s bottom line.
Maxed out data centres
Many businesses have limited resources to invest in physical servers and storage, and so are increasingly looking to data centres to store their information. As a result, data centres across Europe are quickly filling up.
Due to European data retention regulations, which dictate that information is generally stored for longer periods than in other regions such as the US, businesses across Europe must hold on to their data for a very long time before they can dispose of it. For instance, under EU law, telecommunications service and network providers are obliged to retain certain categories of data for a specific period of time (typically between six months and two years) and to make that information available to law enforcement where needed. With this in mind, it’s no surprise that investment in high-performance storage capacity has become a key priority for many.
Time for a clear out
So how can organisations deal with these storage issues? They can upgrade or replace their servers, parting with a lot of capital expenditure to bring in more CPU power or more memory. An alternative is to “spring clean” their information. Smart partitioning lets businesses spend roughly one tenth of what new servers and storage capacity would cost, and instead refocus how they organise their information. With smart partitioning capabilities, businesses can get the benefits of archiving even for information that is not yet eligible for archiving under EU retention regulations. Furthermore, application retirement frees up floor space, drives modernisation initiatives, and allows mainframe systems and older platforms to be replaced and legacy data to be migrated to virtual archives. Before IT professionals go out and buy big data systems, they need to spring clean their information and make room for big data.
Poor economic conditions across Europe have stifled innovation for many organisations, which have been forced to focus on staying alive rather than investing in R&D to improve operational efficiency. They are, therefore, looking for ways to squeeze more out of already shrinking budgets.
The likes of smart partitioning and application retirement offer businesses a real solution to the growing big data conundrum. So maybe it’s time you got your feather duster out, and gave your information a good clean out this spring?
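As a rough sketch of the idea behind that kind of spring clean, age-based partitioning splits records into an active set, an archivable set, and a set whose retention period has expired. The retention windows and record layout below are invented for illustration, not taken from any specific regulation or product:

```python
from datetime import date, timedelta

# Hypothetical retention windows, for illustration only.
RETENTION = timedelta(days=2 * 365)   # must be kept (e.g. EU-style rules)
ACTIVE_WINDOW = timedelta(days=180)   # kept on fast, expensive storage

def partition(records, today):
    """Split records into active, archivable and purgeable sets by age."""
    active, archive, purge = [], [], []
    for rec in records:
        age = today - rec["created"]
        if age <= ACTIVE_WINDOW:
            active.append(rec)        # stays on primary storage
        elif age <= RETENTION:
            archive.append(rec)       # can move to low-cost archive storage
        else:
            purge.append(rec)         # retention expired: safe to delete
    return active, archive, purge

records = [
    {"id": 1, "created": date(2012, 6, 1)},
    {"id": 2, "created": date(2011, 1, 1)},
    {"id": 3, "created": date(2009, 1, 1)},
]
active, archive, purge = partition(records, today=date(2012, 7, 1))
```

The point of the sketch is that only the first bucket needs expensive, high-performance storage; everything else can be moved or removed without breaching retention obligations.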
If you haven’t updated your B2B integration capabilities in the past five years, are you at risk of being left behind? This is the age of superior customer experience and rapid time-to-value, so speedy customer on-boarding and support for specialized integration services mean the difference between winning and losing business. A health check starts with asking some simple questions: (more…)
Bring The Outside In: Why Integrating External Data Sources Should Be Your Next Data Integration Project
We recently had Ted Friedman, VP Distinguished Analyst from featured analyst firm Gartner, speak about what companies can do to extend their internal data integration strategies to include integrating external data sources. If the next generation of your DI projects includes inter-enterprise data sources, you’re in good company: he mentioned that data integration tools are used for inter-enterprise integration 28% of the time, almost the same rate as Master Data Management (MDM) and Data Services use cases. Ted also predicted that the inter-enterprise use case will continue to grow as more integration projects include data from outside the firewall. (more…)
In recent Aberdeen research, 95% of the 122 respondents relied on some level of manual processing to integrate external data sources. Manual processing of external data is time consuming, expensive and error prone, so why do so many do it? Well, they often have little choice. If you look deeper, most of these data exchanges are with small partners, and small partner enablement is a significant challenge for most organizations. For the most part, (more…)
The other week we had Ayman Taha, Director of IT for Enterprise Solutions Integration at Avnet, in to talk about how he automated the processing of unstructured, non-traditional B2B exchanges with external partners. Avnet is a large distributor of electronic components with a diverse ecosystem of customers and suppliers. Its B2B infrastructure reflects the complexity of that environment but, despite a sophisticated and mature EDI infrastructure, the company was still manually re-keying invoices and product updates from hundreds of spreadsheets and PDFs received from partners. This was because many of its smaller customers and partners could not send or receive EDI messages. (more…)
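To make the re-keying problem concrete, the automation described above boils down to mapping each small partner’s ad-hoc layout onto one canonical schema. This is an illustrative sketch only: the partner name, column headings and canonical field names are invented, not Avnet’s actual formats or Informatica’s implementation.

```python
import csv
import io

# Hypothetical canonical invoice schema and per-partner column mappings.
PARTNER_MAPPINGS = {
    "acme": {"Inv#": "invoice_id", "PN": "part_number",
             "Quantity": "qty", "Price": "unit_price"},
}

def normalise(partner, raw_csv):
    """Turn one partner's CSV layout into canonical invoice records."""
    mapping = PARTNER_MAPPINGS[partner]
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Rename the partner's headers to canonical field names.
        rec = {mapping[k]: v.strip() for k, v in row.items() if k in mapping}
        rec["qty"] = int(rec["qty"])            # type coercion replaces re-keying
        rec["unit_price"] = float(rec["unit_price"])
        rows.append(rec)
    return rows

sample = "Inv#,PN,Quantity,Price\nI-100,R-77,5,1.25\n"
recs = normalise("acme", sample)
```

Adding a new small partner then means adding one mapping entry rather than another round of manual data entry.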
One major outcome of the Big Data debate, especially in the telco industry, is the sudden elevation of the focus on customer experience, with Quality of Experience (QoE) and Customer Experience Management (CEM). With new and emerging technologies such as Near Field Communication (NFC), Machine to Machine (M2M) and mobile social media apps hitting the news every day like a reality star’s socialising antics, we are all fascinated by how much organisations either know, can find out or can deduce about our lives: what we like and dislike, how much we may be worth to those organisations, what we already own, and even where we physically are or will be in the next few minutes. All the minutiae of our lives and personalities laid bare to be pawed over, analysed and used to control us and eventually sell us yet more ‘stuff’. (more…)
For once, hype could be a good thing. Well, it is if you’re reading Gartner’s latest Hype Cycle for Application Infrastructure, published in July, because in it you will see how two important technology trends, B2B Gateway Software (BGS: sorry, another TLA for you to learn, and it’s even a nested one!) and Managed File Transfer (MFT), have now made it out of the Trough of Disillusionment and up onto the Slope of Enlightenment. Why do I suddenly feel like John Bunyan’s Pilgrim?
Anyway, the key points that Gartner identifies are that centrally managing B2B interactions provides:
- Economies of scale and deeper insight into the technical aspects of data integration, transaction delivery, process integration and SOA interoperability, such as consolidating, tracking, storing and auditing files, messages, process events, acknowledgments, receipts, and errors and exceptions.
The B2B communications process with your external business partners, suppliers, etc. is not a static process and you need to be able to have visibility of these communications for not only regulatory compliance and auditability issues but also to manage the dynamic process.
- A single point through which to troubleshoot B2B integration issues.
B2B Gateways are now a mature technology and, like standards, most organisations use a number of them. Integrating them provides significant benefit, enabling the organisation to gain visibility of its business relationships and transactions, and to know where to go when things go wrong and need managing, as they definitely will.
- A central, reusable repository for external business partner profiles and Web services APIs. This is particularly valuable when dealing with a large number of external business partners and cloud APIs, and when multiple business units interact with the same partners or cloud services.
The number of business partners we all have to deal with is increasing rapidly as we outsource, subcontract, farm-out and generally rely more on external specialist organisations. Having visibility of these relationships and making the most from new integration methodologies and processes can generate great savings and also give visibility of our business exposure to these suppliers.
- Support for the myriad data formats, transport and communication protocols, and security standards.
As the old saying goes, “I love standards: there are so many to choose from.” Our business processes are not getting any simpler; data standards are under constant change and revision, with data formats becoming increasingly complex. So the ability to handle not just a few but all key formats, to reuse previous transformation experience, to utilise already-developed libraries and to leverage new complex hierarchical data structures makes the difference between a stove-piped, soon-to-be-redundant system and one that is flexible and supports new and ever-changing business requirements.
As one of the vendors Gartner identifies as able to compete actively with offerings positioned to address the broader set of usage scenarios, Informatica with its B2B Data Exchange solution not only supports the B2B functional requirements an organisation will have, but also integrates this process and data into the wider internal data integration platform and management process. (Look at this presentation for the key new features in the 9.5 release.)
So now the reality, not the hype, of B2B solutions can be delivered.
Sources: Gartner Hype Cycle for Application Infrastructure, 2012. Published: 24 July 2012
Analyst: Jess Thompson
Gartner Disclaimer re the Hype Cycle
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Why have B2B solutions been defined as sitting outside the firewall? Increasingly, we are seeing organisations that have evolved, or are evolving, into amorphous structures that are hard to define as either single or multiple entities. Organisations that were traditionally compartmentalised within a specific industry, offering a specific service, are now frequently focusing on their core abilities and organisational strengths and applying them to serve other departments, subsidiaries or even organisations in totally separate industries. Areas of expertise such as payments processing for financial services companies, or bill processing by telecommunications operators, are seen as revenue generators rather than corporate overheads. We also see supposedly key processes, such as call centres and networks, being outsourced. Organisations are diversifying, hoping that owning the customer or consumer will be enough to resell associated products or services that they have branded or can acquire from other departments or suppliers. It is a long time since organisations have had to own all their product and service creation and delivery functions, however successful and capable they are; but they do have to maximise the revenue and benefits they receive from them.
All in all, the hairball of inter-relationships within and outside the organisation is becoming more and more convoluted. The traditional supply chain and its management, typically seen as the lifeblood of industries such as manufacturing and retail, have increasingly been absorbed by sectors as diverse as financial services, telecommunications and the public sector, which rely on partner organisations for key parts of their product and service creation, delivery and support.
But this evolution throws up significant issues as well as benefits.
A major issue is the management and control of access to data and security compliance. Visible security management, access control and auditability are prerequisites of any customer data integration solution but frequently data and access from partners and from within an organisation are viewed as separate processes.
The ability to respond swiftly to changing business and market requirements means not only managing new partnerships and data flows, but also recognising that the organisation or department providing your core services last week may not be the same one next week.
This all means that traditional B2B data flows can now be rethought. The benefits of B2B solutions, such as partner on-boarding and management, data format transformation and managed file transfer, are just as relevant within an organisation and its departments as when connecting external partner organisations.
The ability to link and manage data-publishing organisations, systems and applications together with the applications within your organisation or department that consume them is just as relevant inside the firewall as outside it.
And if you can integrate external organisations’ and internal departments’ data, then you are definitely on the road to solving the problems of business change, data security and regulatory compliance, and to maximising the value of your most important asset: data.
Today, Informatica is announcing the immediate availability of Informatica HParser, the first enterprise-class data parsing transformation solution for Hadoop environments. Available in a free community edition and in commercial editions, Informatica HParser empowers organizations to maximize their return on data by extracting the value of the complex, unstructured data traditionally under-exploited in the enterprise. Please watch Ronen Schwartz, Vice President of Products, B2B Data Exchange and Data Transformation, explain what drove Informatica to build and release HParser: Why We Built HParser.
To understand why this is important to the Hadoop community, let’s look at how organizations are using Hadoop today. In 2011, Ventana Research completed a benchmark research survey among 163 large-scale data users. (more…)
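As a rough analogy for what parsing inside Hadoop involves (HParser itself is Informatica’s commercial engine, and this sketch does not use it), a hand-rolled equivalent for one simple case would be a Hadoop Streaming mapper built around a parse function that turns semi-structured lines into tab-separated records. The log layout and field names below are invented for illustration:

```python
import re

# Hypothetical semi-structured log format: "<timestamp> <LEVEL> user=<name>".
LINE = re.compile(r"(?P<ts>\S+) (?P<level>[A-Z]+) user=(?P<user>\w+)")

def parse(line):
    """Return a tab-separated record, or None for unparseable input."""
    m = LINE.match(line)
    if m is None:
        return None  # in a real mapper, route these to an error stream
    d = m.groupdict()
    return "\t".join([d["ts"], d["level"], d["user"]])

# In a streaming job, the mapper script would loop over sys.stdin and
# print parse(line) for every line that matches.
```

The contrast this is meant to illustrate: hand-written parsers like this must be rebuilt for every format, whereas a transformation engine lets the same parsing definitions run unchanged across many formats and inside the cluster.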