In 2015 I gave a talk at the Gaming Analytics Summit about my experience at Daybreak Games working with the different game teams. Since then, I’ve worked with product teams at Electronic Arts and Twitch and wanted to provide an update to this talk. Most tech companies now have analytics teams, but it’s often challenging to translate data insights into actionable results.
The challenge I faced at Daybreak Games was that I led a central analytics team in the marketing organization that needed to work with game studios located in San Diego and Austin. We had embedded analysts on the product teams, but needed more structure in place to make the studio leads more data-informed. One way we worked to accomplish this goal was to revamp how the analytics team scheduled and ran meetings with product teams.
When redesigning how to set up meetings, we asked the following questions:
- What data should we share with product teams?
- When and how should we share data?
- How do we act on data insights?
To make sure that the analytics team could communicate effectively with the product teams, we needed to decide on a set of product KPIs (key performance indicators) that would be a starting point for discussions. Without first deciding on how to track product performance, and understanding the relationships between different KPIs, you risk meetings getting derailed by questions about what is important to track. At Daybreak Games, we came up with six metrics that we tracked across all of our games, and built automated reports with these KPIs that the studios could access. One concern we did have was sharing too much information about sensitive topics, such as company financials, broadly across the company. Our approach was to make all reporting and data available to the studio leads and provide all members of the game teams with access to game-specific reports.
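As a hypothetical illustration of the automated KPI reporting described above (the actual Daybreak metrics and data schema are not public), a minimal daily roll-up in Python might look like this, with invented player session and purchase records:

```python
from datetime import date

# Invented sample data; real reports would query an event warehouse.
sessions = [
    {"player": "a", "day": date(2015, 3, 1)},
    {"player": "b", "day": date(2015, 3, 1)},
    {"player": "a", "day": date(2015, 3, 2)},
]
purchases = [
    {"player": "a", "day": date(2015, 3, 1), "usd": 4.99},
]

def daily_kpis(sessions, purchases, day):
    """Compute a few example KPIs for one day: DAU, revenue, ARPDAU."""
    dau = {s["player"] for s in sessions if s["day"] == day}
    revenue = sum(p["usd"] for p in purchases if p["day"] == day)
    return {
        "dau": len(dau),
        "revenue": revenue,
        "arpdau": revenue / len(dau) if dau else 0.0,
    }

print(daily_kpis(sessions, purchases, date(2015, 3, 1)))
```

The point is less the specific metrics than having one agreed-upon definition of each KPI, computed the same way in every report the studios see.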
Once we had decided on how to communicate with product teams, we needed to determine how to engage with teams through meetings. We came up with three approaches, discussed in more detail in the next section. The goal of these different meetings was to educate the product teams about our KPIs, hold regular check-ins to review performance, sync on plans for new features, and make recommendations based on findings from the analytics team. We wanted to provide enough structure that product teams felt the meetings covered product performance in sufficient detail, rather than defaulting to ad-hoc requests to analysts.
The third challenge was getting product teams to respond to data from reports and analysis. Our initial focus was to make automated reports accessible to the studio leads, and let the product teams lead more of the discussion in analytics meetings. The success of this varied by game team, since the leads had different backgrounds and technical expertise, but as a company we did make some headway in becoming a data-informed company. Another approach we took was to work directly with game teams on productizing models proposed by the analytics team. This included improvements to in-game messaging systems, marketing efforts, and in-game marketplaces.
At Electronic Arts and Twitch, I faced a different problem working with product teams, resulting from the scale of these companies. Most product teams had identified a set of KPIs to track, but it was unclear how the different KPIs across teams interacted. If the relationships between metrics are unclear, then meetings might get derailed over discussions about cannibalization of products, or concerns over attribution. To address this concern at Electronic Arts we had to better understand the relationship between the EA Access subscription service and Xbox One game sales, while at Twitch we had to more deeply understand the relationship between web and mobile viewership. Once agreement was reached on how these different KPIs interact, the analytics teams were able to have more effective meetings.
Data Meetings
After revamping our KPIs and reporting at Daybreak Games, the analytics team started hosting the following types of meetings with product teams:
- Analytics 101
- Data Scrum Meetings
- Data Insights Meetings
The first type of meeting we set up was with the leads of the product teams, to review our new reporting tools and explain the set of KPIs that we focused on tracking. We then opened up these meetings to everyone at the company and advocated for being more data-informed. The goal of these Analytics 101 meetings was to make sure everyone at the company knew what we were tracking and why. We also hosted analytics office hours where members of the product teams could learn how to get more hands-on with our data.
The second set of meetings we set up were short weekly check-ins with the game teams, called data scrum meetings. We had a meeting for each of the games, and the attendees included leads across different disciplines, including design, engineering, and brand. The goal of these meetings was to track the performance of the title and discuss how updates, such as promotional events, impacted the KPIs. At Twitch, we also used these types of meetings to discuss upcoming features and make sure that we had tracking specs and experiment plans for new features we were launching. The data scrums were also useful for reviewing ad-hoc requests that the analytics team had looked into over the past week and sharing the results with the broader team.
We didn’t want to overload ourselves with meetings, so we kept data scrum meetings short and scheduled data insights meetings as needed, when there were more substantial results to discuss. The goal of these meetings was to present the results of an analysis, explain why it matters, and decide how to respond to the findings. For example, we might want to make a change to one of the game’s marketplaces based on an analysis of seasonal item sales. At Electronic Arts we held similar meetings, and the deliverable was usually a PowerPoint deck shared after the meeting. Twitch took a somewhat different approach: the analytics team would write up a long-form report, email it to the team, and then set up a meeting based on feedback from the product team. Data insights meetings were especially useful when A/B experiments had conflicting results and the product team needed to decide whether or not to fully roll out the feature.
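To make the conflicting-results scenario concrete, here is a hypothetical sketch using a standard two-proportion z-test, applied to two metrics that move in opposite directions. The numbers are invented and do not come from any actual Electronic Arts or Twitch experiment:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented experiment: the treatment lifts engagement but hurts conversion,
# the kind of conflicting result a data insights meeting would adjudicate.
z_eng, p_eng = two_proportion_z(4000, 10000, 4300, 10000)   # engagement up
z_conv, p_conv = two_proportion_z(900, 10000, 820, 10000)   # conversion down
print(f"engagement: z={z_eng:.2f} p={p_eng:.4f}")
print(f"conversion: z={z_conv:.2f} p={p_conv:.4f}")
```

When both effects are statistically significant but point in different directions, no test can make the call for you; the roll-out decision comes down to which metric the product team values more, which is exactly the discussion these meetings exist for.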
Analytics teams are usually involved in monthly business review meetings as well, but these are generally not a good forum for working with product teams. Such meetings tend to focus on providing details to executives, and often emphasize product roadmaps rather than explaining past performance.
In order for analysts to work effectively with product teams, it’s important to have enough structure in place for teams to communicate clearly about product performance, share findings as needed, and follow up with actionable results. At Daybreak Games we approached this by standardizing KPIs and setting up automated reporting, scheduling education meetings to advocate for our new approach, holding short, regularly scheduled check-ins with teams, and scheduling meetings to discuss in-depth analyses as needed.
Based on my experience at Twitch and Electronic Arts, the main changes I would make to this advice are to encourage more frequent communication through tools such as Slack, rather than relying on data scrum meetings, and to share results through written reports rather than decks.
Reposted from my Medium blog.