Category Archives: Professional Services
Organizational Change Management and Business Process Re-engineering were all the rage in the 1990s. Much of that thinking still persists, but it is no longer sufficient for the kinds of transformations that organizations must now accomplish on an ongoing basis. A modern business transformation is data-driven, global in nature, crosses functional boundaries, and changes behavior at multiple levels of the organization. To address these needs, organizations must adopt a business-led, enterprise-wide planning capability. (more…)
One of THE biggest challenges in companies today is complexity. To be more specific, unnecessary complexity resulting from silo behaviors and piecemeal point solutions. Businesses are already extremely complex, with the challenges of multiple products, multiple channels, global scale, higher customer expectations, and rapid, constant change, so we certainly don’t want to make the IT solutions more complex than they need to be. That said, I’m on the side of NO, we don’t need a CSO, as this blog recently surveyed its readers. We just need a business architecture practice that does what it’s supposed to. (more…)
Informatica joins new ServiceMax Marketplace – offers rapid, cost effective integration with ERP and Cloud apps for Field Service Automation
To deliver flawless field service, companies often require integration across multiple applications for various work processes. A good example is automatically ordering and shipping parts through an ERP system so they arrive ahead of a timely field service visit. Informatica has partnered with ServiceMax, the leading field service automation solution, and subsequently joined the new ServiceMax Marketplace to offer customers integration solutions for many ERP and Cloud applications frequently involved in ServiceMax deployments. Consisting of Cloud Integration Templates built on Informatica Cloud for frequent customer integration “patterns”, these solutions will accelerate the ServiceMax implementation cycle, contain its cost, and help customers realize the full potential of their field service initiatives.
Existing members of the ServiceMax Community can see a demo or take advantage of a free 30-day trial that provides the full capabilities of Informatica Cloud Integration for ServiceMax, with prebuilt connectors to hundreds of third-party systems including SAP, Oracle, Salesforce, NetSuite and Workday, powered by the Informatica Vibe virtual data machine for near-universal access to cloud and on-premises data. The Informatica Cloud Integration for ServiceMax solution:
- Accelerates ERP integration by as much as 85% through prebuilt Cloud templates focused on key work processes and the objects in common between systems
- Synchronizes key master data such as Customer Master, Material Master, Sales Orders, Plant information, Stock history and others
- Enables simplified implementation and customization through easy-to-use interfaces
- Eliminates the need for IT intervention during configuration and deployment of ServiceMax integrations.
We look forward to working with ServiceMax through the ServiceMax Marketplace to help joint customers deliver Flawless Service!
Over and over, when talking with people who are starting to learn Data Science, there’s a frustration that comes up: “I don’t know which programming language to start with.”
Moreover, it’s not just programming languages; it’s also software systems like Tableau, SPSS, etc. There is an ever-widening range of tools and programming languages and it’s difficult to know which one to select.
I get it. When I started focusing heavily on data science a few years ago, I reviewed all of the popular programming languages at the time: Python, R, SAS, D3, not to mention a few that, in hindsight, really aren’t that great for analytics, like Perl, Bash, and Java. I once read a suggestion to use arcane tools like UNIX’s AWK and SED.
There are so many suggestions, so much material, so many options that it becomes difficult to know what to learn first. There’s a mountain of content, and it’s difficult to know where to find the “gold nuggets”: the things to learn that will bring you a high return on the time you invest.
That’s the crux of the problem. The fact is – time is limited. Learning a new programming language is a large investment in your time, so you need to be strategic about which one you select. To be clear, some languages will yield a very high return on your investment. Other languages are purely auxiliary tools that you might use only a few times per year.
Let me make this easy for you: learn R first. Here’s why:
R is becoming the “lingua franca” of data science
R is becoming the lingua franca for data science. That’s not to say that it’s the only language, or that it’s the best tool for every job. It is, however, the most widely used and it is rising in popularity.
As I’ve noted before, O’Reilly Media conducted a survey in 2014 to understand the tools that data scientists are currently using. They found that R is the most popular programming language (if you exclude SQL as a “proper” programming language).
Looking more broadly, there are other rankings that look at programming language popularity in general. For example, Redmonk measures programming language popularity by examining discussion (on Stack Overflow) and usage (on GitHub). In their latest rankings, R placed 13th, the highest of any statistical programming language. Redmonk also noted that R has been rising in popularity over time.
A similar ranking by TIOBE, which ranks programming languages by the number of search engine searches, indicates a strong year over year rise for R.
Keep in mind that the Redmonk and TIOBE rankings cover all programming languages. Measured against that full field, R now ranks among the most popular and most commonly used overall.
It’s often said that 80% of the work in data science is data manipulation. More often than not, you’ll need to spend significant amounts of your time “wrangling” your data; putting it into the shape you want. R has some of the best data management tools you’ll find.
The dplyr package in R makes data manipulation easy. It is the tool I wish I had years ago. When you “chain” the basic dplyr verbs together, you can dramatically simplify your data manipulation workflow.
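As a minimal sketch of what that chaining looks like (using the mtcars dataset that ships with base R, and assuming the dplyr package is installed):

```r
library(dplyr)

# Each verb does one small thing; the pipe chains them into a workflow.
result <- mtcars %>%
  filter(cyl == 4) %>%            # keep only the 4-cylinder cars
  group_by(gear) %>%              # group them by number of gears
  summarise(avg_mpg = mean(mpg))  # average fuel economy per group

result
```

Read top to bottom, the pipeline says exactly what it does, which is the point: no nested function calls, no intermediate variables to track.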
ggplot2 is one of the best data visualization tools around, as of 2015. What’s great about ggplot2 is that as you learn the syntax, you also learn how to think about data visualization.
I’ve said numerous times that there is a deep structure to all statistical visualizations: a highly structured framework for thinking about and creating data visualizations. ggplot2 is based on that framework. By learning ggplot2, you will learn how to think about visualizing data.
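One way to see that layered framework in action is a small sketch (again using the built-in mtcars data, and assuming ggplot2 is installed):

```r
library(ggplot2)

# Each "+" adds a layer: data and aesthetic mappings first,
# then a geometric layer, a statistical layer, and labels.
p <- ggplot(mtcars, aes(x = wt, y = mpg)) +
  geom_point() +                           # geometric layer: scatter plot
  geom_smooth(method = "lm", se = FALSE) + # statistical layer: fitted trend line
  labs(x = "Weight (1000 lbs)", y = "Miles per gallon")

p
```

Because every chart is built from the same layers, swapping `geom_point()` for another geom changes the chart type without changing how you think about the data.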
Finally, there’s machine learning. While I think most beginning data science students should wait to learn machine learning (it is much more important to learn data exploration first), machine learning is an important skill. When data exploration stops yielding insight, you need stronger tools.
When you’re ready to start using (and learning) machine learning, R has some of the best tools and resources.
One of the best, most referenced introductory texts on machine learning, An Introduction to Statistical Learning, teaches machine learning using the R programming language. Additionally, the Stanford Statistical Learning course uses this textbook, and teaches machine learning in R.
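To give a taste of how approachable modeling in R is, here is a minimal sketch using base R’s `lm()`, with no extra packages, fitting a linear regression on the built-in mtcars data:

```r
# Fit a linear regression predicting fuel economy
# from weight and horsepower.
model <- lm(mpg ~ wt + hp, data = mtcars)

r2 <- summary(model)$r.squared  # proportion of variance explained

# Predict mileage for a hypothetical new car.
pred <- predict(model, newdata = data.frame(wt = 3, hp = 120))
```

The formula interface (`mpg ~ wt + hp`) is the same idiom used throughout R’s modeling ecosystem, including the methods taught in An Introduction to Statistical Learning, so this one pattern carries a long way.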
Summary: Learn R, and focus your efforts
Once you start to learn R, don’t get “shiny new object” syndrome.
You’re likely to see demonstrations of new techniques and tools. Just look at some of the dazzling data visualizations that people are creating.
Seeing other people create great work (and finding out that they’re using a different tool) might lead you to try something else. Trust me on this: you need to focus. Don’t get “shiny new object” syndrome. You need to be able to devote a few months (or longer) to really diving into one tool.
And as I noted above, you really want to build up your competence in skills across the data science workflow. You need to have solid skills at least in data visualization and data manipulation. You need to be able to do some serious data exploration in R before you start moving on.
Spending 100 hours on R will yield vastly better returns than spending 10 hours on 10 different tools. In the end, your time ROI will be higher by concentrating your efforts. Don’t get distracted by the “latest, sexy new thing.”
Whether you are establishing a new outsourced delivery model for your integration services or getting ready for the next round of contract negotiations with your existing supplier, you need a way to hold the supplier accountable – especially when it is an exclusive arrangement. Here are four key metrics that should be included in the multi-year agreement. (more…)
If your goal is to implement a world class Integration Competency Center (ICC) or COE, the best people you could find to make up the team already work for you. If you don’t currently have technical superstars on your team, you can still have a leading-edge world-class ICC that will “wow” your internal customers every time. You don’t need a world-class team to have a world-class competency center… you need a world-class management system. (more…)
They say people are resistant to change. I disagree. People are resistant to uncertainty. Once people are certain that a change is to their benefit, they will change so fast it will make your head spin. It would be a mistake however to underestimate the challenges of changing an organization from one where integration is a collaboration between two project silos to one where integration is a sustainable strategy with a common infrastructure based on strict standards and shared by everyone. (more…)
A CIO told me “After five years with an integration Center of Excellence, I expect them to be excellent. They aren’t.” But so what? The IT organization has lots of things to focus on. Is integration excellence really essential? (more…)
The cover of the September 10 issue of ComputerWorld caught my attention; the headline was Rebirth of Re-Engineering. I was intrigued how the analysts and pundits would spin Business Process Reengineering since I hadn’t seen the BPR acronym much since it fell out of favor around the turn of the century. As it turns out, the NEW BPR is all about Lean and Agile and is being led by IT. Wow! (more…)