The COVID-19 pandemic drove a drastic increase in global dependency on digital technology, yet some types of human-computer interaction (HCI) require more human focus to succeed.
In the Middle East, consumers are increasingly using the internet to buy food, meet each other, and learn in digital spaces. But one industry isn’t faring so well online.
The regional apparel industry has fallen sharply since early 2020, according to McKinsey & Company, which found that consumers shopping online for customised items, such as clothing and beauty products, need more personal attention.
Dubai-based software provider GetBEE says it has created the first personalised network for brands to engage with customers on a more human, immersive level.
Shoppers can view, rate, and receive face-to-face guidance through a video call from their chosen salesperson. Within the call, users can receive direct instructions on how to take their measurements properly and receive a second opinion on items in their virtual shopping cart.
Thea Myhrvold, GetBEE’s founder and CEO, says that despite global fashion profits plummeting in 2020, online retail apparel sales for nearly 20 brands using its platform increased by an average of 28% within a matter of months.
“People buy from people at the end of the day,” says Myhrvold, a former schoolteacher turned serial entrepreneur, who believes similar empathy applies to education as well as to retail.
“86% of customers prefer dealing with a person over a chatbot. And that’s why this personalised emotional purchasing experience truly works.”
Chalhoub Middle East, a regional distributor representing luxury retail clients including Lancôme and Faces, says humanising its online platforms attracted a new wave of shoppers during lockdowns.
“It helped us survive,” says Aleksandra Harciarek, an omni-channel project manager at Chalhoub Group.
“It helped us generate results that would not be there if we would stay, just waiting for our IT to build e-commerce platforms for us.”
While the retail industry builds technology with more relatable output, demand is growing to embed human values in complex systems such as machine learning.
Technologies including artificial intelligence depend on the information their developers feed them, which can introduce human bias and unethical practices if left unregulated.
In 2015, Amazon’s AI recruiting engine was found to rank résumés containing the word “women’s” as less preferable.
A 2018 MIT study found that facial recognition algorithms had higher error rates for Black faces than for White faces.
Following public protests in 2020, Microsoft, IBM, and Amazon refused to sell their facial recognition software to US police, citing fears of racial profiling and mass surveillance.
Despite this, only 25% of businesses consider ethical implications before investing in AI, according to PricewaterhouseCoopers (PwC), which estimates that the technology will contribute about 13 trillion euros to the global economy by 2030.
Datumcon trains neural networks to ensure safe oil drilling, track mask and social distancing compliance, and detect human emotion through sentiment analysis.
Cesar Andres Lopez, Datumcon’s CEO, says AI outperforms humans at specific tasks, such as differentiating between a cat and a dog. Human guidance is essential for more complex tasks such as detecting emotions, yet, he adds, it can introduce human bias if the system is not trained properly.
“This is one of the big problems that we have in the future,” says Lopez, who explains that the key to preventing bias is localisation: training neural networks on local data.
“We make sure that the data tagging is coming from local people because we want to be able to reflect the local characteristics of the data,” he says. The company uses data sets that truly represent a society’s diversity of class, age, ethnicity, and gender.
Datumcon feeds in samples from about 30,000 different faces to teach the detection of each basic emotion used in its sentiment analysis technology.
The results are then reviewed by an in-house psychologist for accuracy.