3 Best Practices for Bridging the Gap Between Engineers and Analysts

As an analytics engineer, one of the most challenging problems I face is bridging the gap between engineering and analytics. Engineering and analytics are often siloed into their own teams, making cross-collaboration quite difficult.
Engineering pushes software and data changes that analytics knows nothing about. Analytics is then forced to pivot its work to accommodate these changes. Or worse, analytics must suggest a change that engineering then needs to fit into its already tight schedule.
If the gap between the teams grows too wide, it adds time, frustration, and data quality issues to an already growing pile of work. This is why it's imperative to implement best practices for communicating with one another from the moment this issue is spotted.
In this article, I'll discuss some of the most common problems I've faced and the best practices you can apply to solve them.
A lack of ownership of code and tests
Often there is a lack of ownership of logic built out in code, especially between analytics and engineering. Who owns the code that generates the data used by analytics daily? What if there is logic built into software code that generates a key field used in reporting?
Ownership of code can be a tricky thing. However, it's best to add as many safeguards as possible. Any code used by analytics teams should have them assigned as part owners. This way, engineering can check with them before making any critical changes.
Adding code owners on GitHub
I recently discovered the code owners feature on GitHub, which allows you to assign owners to specific files or directories in your repo. Using this feature protects the code by requiring a review from the code owners before it can be changed.
My team recently implemented this for some logic built into HTML that calculates an important field used downstream by much of the business. We use the same logic in one of our data models and need to ensure that we are notified if it ever changes.
Adding ourselves as code owners has given us the extra confidence that if this ever changes, we will know about it.
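As a rough sketch of how this looks in practice: a CODEOWNERS file lives at the root of the repo (or in .github/) and maps file patterns to required reviewers. The paths and team name below are hypothetical, not from my actual repo:

```
# .github/CODEOWNERS (hypothetical paths and team names)

# Any PR touching this file requires a review from the analytics team.
/templates/pricing.html    @our-org/analytics

# Protect the data model that reuses the same logic downstream.
/models/marts/pricing.sql  @our-org/analytics
```

With branch protection enabled on the repo, GitHub will block merging changes to these files until a listed owner approves the pull request.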

Many teams also lack owners of the tests in their data environment. While having tests to detect data quality issues is a step in the right direction, you need to ensure someone is owning them.
I've seen alerts go off in Slack, only for nobody to end up fixing them. The engineering and analytics teams each assume the alert is the other team's responsibility, so neither does anything. Meanwhile, the problem persists and often worsens.
Assigning clear owners (whether it's an engineer or data analyst) on a test will leave no room for questioning. Everyone knows exactly who should be working on resolving a failing test.
I highly recommend assigning a data engineer to tests that have failed on raw, source data, an analytics engineer to tests failing on data models, and a data analyst to tests that fail in the BI layer.
Assigning owners to dbt tests
If you use dbt for data transformation and testing, there is a free dbt package called Elementary that allows you to assign owners to your tests.
```yaml
tests:
  - not_null:
      meta:
        owner: ["@madison.mae"]
```
I simply tag the Slack handle of the person responsible for the test, causing them to be pinged when it fails. This way the person who needs to be notified is notified, and everyone knows who is working on fixing the failing test.
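For context, here is a hedged sketch of how that owner tag might sit inside a full dbt schema.yml file. The model and column names here are made up for illustration:

```yaml
# Hypothetical schema.yml — model and column names are illustrative.
version: 2

models:
  - name: orders
    columns:
      - name: order_id
        tests:
          # Elementary reads the meta.owner tag and includes it
          # in the Slack alert when this test fails.
          - not_null:
              meta:
                owner: ["@madison.mae"]
```

Keeping the owner tag next to the test definition means ownership travels with the code, rather than living in a separate document that goes stale.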
A lack of someone who understands engineering and business
Part of the reason there is such a gap between engineering and analytics is the language each side speaks. Engineers don't quite understand the business point of view of data analysts, and data analysts don't always understand the technical jargon of engineers. To bridge this gap, you may need to hire an analytics engineer.
You can think of an analytics engineer as someone with a mix of data engineering and data analysis skills. Their main focus is data quality: building and testing reliable data models for data analysts, using the data produced by data engineers.

Because they are both technical and business-minded, they can ask engineers the right kinds of questions and communicate what needs to change in the data. In addition, they can explain to analysts any technical issues that may be blocking their work from progressing.
As an analytics engineer myself, I've found that we do a lot of work that wouldn't otherwise get done. This includes documentation, data quality checks, and testing. While these are all very important things, they often get pushed to the back burner by engineers and analysts.
Analytics engineers are focused solely on ensuring you have the highest-quality data that is easy and ready to use.
Create a flywheel to streamline the collaborative process
Lastly, systems need to be put in place for effective communication between engineering and analytics. Instead of working as two separate teams, they need to work together. Analytics' work relies directly on engineering's, so analytics should be a large part of engineering's process.
I've recently learned that the best way to do this is through flywheels.
A flywheel is a system that makes the result you want easier and easier to achieve, with less effort.
Nathan Barry explains it best using a graphic of a hand-powered pump and a flywheel.
Flywheels help you to work smarter, not harder. They help you put systems into place that continue to build upon one another, making the subsequent step more frictionless than before.
Flywheels can be applied to pretty much anything in life: household chores, paying the bills, your workout routine, meal prepping. However, let's talk about how they apply to the collaborative process between analytics and engineering.
When building a flywheel, you need to focus on the outcome and the steps in between. This will help you pinpoint where the disconnect is. What steps are currently being done out of order? How can we move them around so that this process moves faster, with less friction?
For example, maybe things are currently done in this order:
- Engineering considers the different things that need to be tracked in a table.
- They diagram it out and figure out how to build it.
- They push it and tell analytics that it's ready to use.
- Analytics comes back with pieces of data they need, or requested changes.
And the flywheel starts all over again…
To make this faster, we could add analytics to the first step. This way, engineering considers all requests from the beginning, before the table is even built. This will allow them to plan better and reduce the changes that need to be made to the architecture later.
The new steps would look like this:
- Engineering and analytics consider all of the different things that need to be tracked in a table.
- Engineering diagrams it out and figures out how they are going to build it.
- They push it and tell analytics that it's ready to use.
Now this looks much more efficient.
This is a bit oversimplified, but now you better understand how to improve your systems to get better results with less effort.
Conclusion
Bridging the disconnect between analytics and engineering teams doesn't happen overnight. In fact, this is something I've been working on for the last year as an analytics engineer. It takes time and patience. Every step forward is something to be celebrated.
Remember: something that took a long time to become the way it currently is will also take time to change. Focus on implementing best practices like these, and you will eventually have teams that work together seamlessly and push out amazing deliverables.
Good luck!