Agile software engineering practices have become the standard work management approach for modern software development teams. Are these techniques applicable to analytics, or is the nature of research prohibitively distinct from the nature of engineering? In this post I explore some of the benefits of using a scrum-like work management process in analytics.
Many Data Scientists come from a hard science background - statistics, math, physics. Hard sciences have a bias towards empirical and objective truths: a correct answer exists, and we can find it by employing the scientific method, usually through a formulaic approach to the problem at hand. While that is not a controversial statement in itself, years of studying and applying such a paradigm can collide with the practical realities of the business world. In that world, it becomes increasingly difficult to apply the theory perfectly, so the practitioner should understand how to adjust their model and their approach accordingly.
Data warehouses are not just for business intelligence (BI) anymore. You can maximize the value of your data engineering, data science, and analytics work by investing in a multi-use data platform that serves business users, Analysts, Statisticians, and intelligent applications. In my last post, data-dies-in-darkness, I described how you can improve your organization’s data quality by exposing more data to more people. You can stretch this idea even further by expanding the stakeholders of your data warehouse to include intelligent applications.
I love doing reporting. Well, I don’t actually love doing reporting, but I love what it can do for an Analytics team. If executed well, reporting can be the gateway drug that leaves an organization completely addicted to its Analytics team. If executed poorly, the Analytics team can turn into a team of reporting monkeys - we all know what that is like. Here is some advice on how to use reporting to build strong stakeholder relationships in your organization.
The fastest way to doom an Analytics team (and any hope of building a data-driven organization) is to present data and analyses that are often flawed or inconsistent. When people don’t believe they can trust the data, they will stop using it (and, if you are an analytics leader, you might soon be looking for a new job).
Attribution is a tough challenge that is top of mind for every Marketing and Analytics leader. While marketing strategies and technologies may have evolved, the most important question has not changed - Is our marketing working? The right attribution solution should help you answer that question. But how do you find the right solution? Unfortunately, there is no turnkey attribution solution that perfectly solves all of your measurement challenges. Each business has unique attribution challenges, and there is a seemingly infinite number of vendors and methodologies. As a result, I created a framework to navigate the increasingly complex multi-touch attribution market, understand the trade-offs between solutions, and identify the optimal attribution solution.
People often ask for advice about building out an analytics organization – How should the team be structured? What skills should we hire for? Do we need engineers? What about data scientists? How big should the team be? Unfortunately, there is no easy answer to these questions, because the best analytics team is the one that best supports the organization and its specific needs. To make things even more complicated, A) different organizations have very different needs and B) your organization’s needs today will be very different from its needs in the future. In this post I will discuss some of the dimensions that are important to evaluate when thinking about how to structure an Analytics team.