My phone thinks I’m a prostitute (and it’s actually pretty close to the truth).

Data is at the heart of everything we do – but it’s not always enough. We need the human touch too. We need it to reach out to the people using our app so that we can learn more about them. And we need it when we’re delivering them the Insights that will ultimately change their health.

Everyone at HeadUp is working remotely. Except for me. I still show up at the office each morning, turn on the lights and get on with my day. It’s just me, so there’s no risk of COVID-19 transmission. The HeadUp Melbourne office is in South Melbourne, an increasingly gentrified city-fringe area that’s full of brand storytellers, trendy cafes, skinny jeans, man buns, full-sleeve tattoos, beard wax, converted warehouses and artisanal, organic, non-GM, locally grown everything.

If you visit the newer, creative industrial hub of your city (where design, advertising, media & digital agencies and architects tend to dwell), you’ll probably find it’s in a place that’s also dotted with a few businesses left over from when the neighbourhood was more colourful. Businesses like brothels.

Every morning as I set off for my office, my iPhone delivers me this message: “You are 16 minutes away from work at Gotham City, South Melbourne.” Gotham City is an oddly themed, upmarket, legal “massage parlour” in inner-city Melbourne. It also happens to be across the road from HeadUp.

The estimate of my commute is, on one level, eerily accurate. From a satellite in space, my phone has managed to figure out where I go each day to within about 50 feet. It’s helpful, too, because its intent is genuinely useful: it makes sure I get to work safely and on time.

But it’s just slightly off, which has led it to an entirely wrong conclusion about what I do for work. It’s a perfect example of one of the perils of big data, and a good segue into sharing some of what we’ve learned at HeadUp about using data to get closer to people so we can better understand and serve them.

When personal data is inaccurate or incomplete? Fill the gaps.

Apple’s assumption about my workplace is a funny example of how data collected about us can be so close, yet ultimately wrong. That’s a problem when the purpose is to build trust with me, or when the data is informing other decisions about me.

Yet, if my phone asked me tomorrow to confirm whether I worked at Gotham City or to nominate my place of employment, then this would provide Apple with more accurate and useful data AND build a greater sense of intimacy and trust with me.

One of the unique things about our approach here at HeadUp is that we make sure people are engaged enough so that we can collect self-reported data. In the context of health, it’s often assumed that self-reported data is unreliable. Generally, we’ve found the opposite to be true. By asking people to confirm or clarify existing data, we’re able to train our models and simultaneously build trust.
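
To make that concrete, here’s a minimal sketch in Python of the idea, with entirely hypothetical field names (not HeadUp’s actual data model): a passively derived estimate is only trusted until the person confirms or corrects it, and the confirmed value is the one we act on and learn from.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DataPoint:
    """One fact about a person, with its provenance."""
    value: float
    source: str             # e.g. "device_estimate" or "self_reported"
    confirmed: bool = False


def reconcile(estimate: DataPoint, self_report: Optional[DataPoint]) -> DataPoint:
    """Prefer a confirmed self-report over a passive estimate."""
    if self_report is not None and self_report.confirmed:
        # The confirmed value is what we act on; the (estimate, confirmed)
        # pair can also become a training example for the estimating model.
        return self_report
    return estimate


# Hypothetical usage: a model estimated a waist circumference of 94 cm,
# but the person measured and reported 88 cm.
estimated = DataPoint(value=94.0, source="device_estimate")
reported = DataPoint(value=88.0, source="self_reported", confirmed=True)
print(reconcile(estimated, reported).value)   # 88.0, the self-report wins
```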

If we want to keep asking questions and collecting more data, we have to build trust and intimacy, too. Every data point we collect and every Insight we deliver back to a person has to move us a step closer to them so that they share more data.

We work hard to get this right. That might sound unfashionable for a data science company to admit, but it’s true. We don’t sit back and let AI and ML do all the work. Without data, we know nothing about people. But with data, we then have the challenge of sorting through it and organising it in such a way that we don’t fall into the trap of overconfidence.

Here’s how we do it.

We’re able to ingest enough data to quite accurately predict whether or not an individual is likely to be obese. But we still ask them to measure their waist circumference. And even once we have that, we ask them what size jeans they normally buy. It’s all done in a sequence, so the questions are contextual and relevant and never feel invasive.
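
As an illustration only (the question text, field names and thresholds below are invented, not our actual logic), the sequencing might look something like this: each follow-up question fires only when the data already in hand makes it relevant.

```python
from typing import Optional


def next_question(profile: dict) -> Optional[str]:
    """Pick the next contextual question for a hypothetical profile dict.

    Each follow-up only fires when the data already in hand makes it
    relevant, so the questions feel like a conversation, not a survey.
    """
    # Step 1: a passive estimate suggests elevated risk, so ask for a real
    # measurement rather than relying on the model's guess.
    if profile.get("predicted_obesity_risk", 0.0) > 0.5 and "waist_cm" not in profile:
        return "Grab a tape measure: what's your waist circumference in cm?"

    # Step 2: we have a measurement, so cross-check it against something
    # people know off the top of their head.
    if "waist_cm" in profile and "jean_size" not in profile:
        return "What size jeans do you normally buy?"

    # Nothing relevant to ask right now.
    return None


# Hypothetical usage
profile = {"predicted_obesity_risk": 0.72}
print(next_question(profile))        # asks for waist circumference
profile["waist_cm"] = 88
print(next_question(profile))        # asks for jean size
```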

By clarifying and confirming people’s data, we’re able to overtly demonstrate that we’re paying attention (which reinforces trust) and simultaneously correlate all their data points to make sure we have up-to-date information. This human factor is often overlooked in the field of data science.

Once you’ve earned permission from someone to collect information about them, there’s then an opportunity, and in fact an obligation, to involve them in the process. If Apple asked me whether I’d like directions or up-to-date traffic reports on my commute, and to confirm the address of my business, I’d willingly tell them. Because I can see that they’re asking these questions to help me get to where I need to go. And logically, how can they get it right if I don’t help them along?

We ask sensitive questions, too. But asking someone straight up if they smoke, for example, can be a pretty unreliable way of getting an answer. Earning trust first, and establishing that the question is being asked to help them improve their life and take better care of themselves, dramatically improves the odds of getting accurate self-reported data. The odds improve further still when the question is framed using data that has already been collected.

“Glenn, you have a low VO2 Max, which you may recall is a really accurate marker for your health. Having a red rating for your VO2 Max isn’t something to ignore. It’s important that we focus on this together and get you into the green zone. For many people, the way to improve it is to get in shape and reduce their waist circumference and weight. But, good news, you’re in the healthy range for both of these things. So, we need to dig a little deeper to make sense of this and help you figure it out. Are you a smoker?”
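
A rough sketch of how a message like that could be assembled (the wording, rating labels and parameters below are illustrative, not our actual Insight engine): the sensitive question only arrives after the data the person has already shared has been played back to them.

```python
def smoking_question(name: str, vo2_rating: str, waist_ok: bool, weight_ok: bool) -> str:
    """Frame a sensitive question using data the person has already shared."""
    lines = [
        f"{name}, you have a low VO2 Max, which is a really accurate marker for your health.",
        f"Having a {vo2_rating} rating for your VO2 Max isn't something to ignore.",
    ]
    if waist_ok and weight_ok:
        # Their measurements don't explain the low score, so dig deeper
        # before asking the sensitive question.
        lines.append(
            "Good news: you're in the healthy range for waist circumference and weight, "
            "so we need to dig a little deeper. Are you a smoker?"
        )
    else:
        lines.append(
            "For many people, the way to improve it is to get in shape and reduce "
            "their waist circumference and weight."
        )
    return " ".join(lines)


print(smoking_question("Glenn", "red", waist_ok=True, weight_ok=True))
```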

Tech needs the human touch.

When presented with their own data in this way, people are more likely to be receptive to further questions. They’re more likely to want to be actively involved in building their own data. This creates trust and intimacy and reduces the likelihood of us holding incorrect data. In turn, we can serve them better with more detailed and highly personalised Insights, which builds yet more trust and intimacy.

By demonstrating a ballpark knowledge of an individual, we’re able to earn their respect and gain permission to achieve more precise knowledge.


My late father, Ronnie, once told me, “Near enough isn’t good enough. Unless you’re trying to kill someone with a hand grenade”. In data science, near enough can be good enough, provided it’s presented to people in the right way. It can be a valuable way to demonstrate that you’re paying attention and have the capability to help.

But data by itself can’t get you all the way. That’s why we employ people from both the science and humanities faculties. The human touch matters.

I guess the business across the road already knows that. Boom.
