Note: This post originally appeared on the blog of my since-shuttered analytics firm, Axiomatic. That said, if you need some analytics work, get in touch.

Let's say you have a suite of native apps interacting with a web API. How do you find out how all of those features are actually getting used?

Once you have that information, what would you do with it?

This case study is for Axiomatic's first client: DyKnow, a classroom management software company located right here in Indianapolis.

What follows is a short conversation I had with Alex Billingsley, Product Lead at DyKnow, about how Brass Tacks (our analytics audit service) helped answer those sorts of questions.


What is DyKnow?

DyKnow is classroom management software that shows teachers what students are doing on their devices. Our suite of monitoring features blocks applications and filters URLs in real time to make the most of 1:1 devices, including Chromebooks, Windows devices, iPads, and Macs.

Who are your customers?

Our target market is large school districts with fairly new technology programs. Our buyers are superintendents, tech directors, and decision-making administrators, but the end user we build our product for is the average teacher.

How were you using analytics before? How often were they used?

DyKnow is in the process of transitioning our software from a solution that is traditionally purchased as a perpetual license and installed locally on a school's network to one that is cloud-hosted and web-delivered. With our legacy solution, which has been around for approximately 10 years, we relied on direct feedback from our customer base and observation by members of our team to make guesses at how our product was being used in the market. In our next-generation product, we snapped in the Google Analytics tracking code and would periodically check in on how our users were using our web application.
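(For context: "snapping in" a Google Analytics tracking code typically amounts to page-level tracking along these lines. The sketch below is a generic example using Google's analytics.js library, not DyKnow's actual setup; the property ID is a placeholder.)

```typescript
// Generic page-level Google Analytics tracking via analytics.js.
// The `ga` function is provided by Google's standard loader snippet;
// "UA-XXXXXXX-Y" is a placeholder property ID.
declare const ga: (...args: unknown[]) => void;

ga('create', 'UA-XXXXXXX-Y', 'auto'); // register the property for this site
ga('send', 'pageview');               // record a pageview for the current page
```

This tells you which pages get loaded, but not much about which features get exercised, which is the gap the next answer gets into.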

What changed with Brass Tacks? How are you using analytics now?

Around the same time we began working with Brass Tacks, DyKnow decided to extend our next-generation product to include natively installed clients on Mac, Windows, iOS, and Chromebook. That decision made it clear that snapping a Google Analytics tracking code into our web client was only going to tell one small part of the product usage story. Because of the nature of our product, we were less interested in whether users were visiting certain pages, since we could pretty much guess which pages people would land on. We were instead interested in which key features were being used, and to what extent. Brass Tacks coached our team on instrumenting our software so that, no matter which client is serving our product, we get deep insight into how it is being used.
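(As a rough illustration of what client-agnostic, feature-level instrumentation can look like: Google Analytics' Measurement Protocol lets any client, web or native, report events over plain HTTP instead of relying on a JavaScript snippet. The sketch below is a hypothetical example, not DyKnow's actual implementation; the property ID, client ID, and event names are placeholders.)

```typescript
// Hypothetical feature-usage event sent via Google Analytics'
// Measurement Protocol (v1). Any client capable of an HTTP POST
// (web app, desktop app, mobile app) can emit the same payload.
async function trackFeatureUse(feature: string, action: string): Promise<void> {
  const params = new URLSearchParams({
    v: '1',                    // Measurement Protocol version
    tid: 'UA-XXXXXXX-Y',       // placeholder GA property ID
    cid: 'anonymous-device-id',// placeholder anonymous client/device ID
    t: 'event',                // hit type: an event, not a pageview
    ec: feature,               // event category, e.g. "app-blocking"
    ea: action,                // event action, e.g. "rule-applied"
  });

  // Measurement Protocol collection endpoint.
  await fetch('https://www.google-analytics.com/collect', {
    method: 'POST',
    body: params,
  });
}

// Example: trackFeatureUse('url-filtering', 'filter-applied');
```

Because every client reports the same event vocabulary, usage from the web app and the native apps lands in one place and can be compared directly.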

DyKnow now has a Google Analytics dashboard near the coffee machine at the office that gives us real-time insight into how our product is being used. We have created several segments so we can watch trends for specific customers as they onboard into the product.

What has been the primary benefit to DyKnow from this? What decisions are being made based on the new data?

The data is still pretty fresh, but we have already noticed that it can help identify bugs in our product and issues with deployments. We expect that eventually we will be able to identify what a “healthy” implementation of our software looks like at a school, which will enable us to be proactive with customers who may be struggling. Our customer relationship team is beginning to leverage this data in conversations about how an implementation is going, and expects that the data will be valuable at renewal time. Our product management office plans to use this data to inform where to focus our development efforts.