Reading reviews written by a robot… and being none the wiser – Octopus.ca

Artificial intelligence systems can be trained to write product reviews that resemble human-written texts, which could prove useful to consumers, marketers, and professional reviewers, according to a study from Dartmouth College’s Tuck School of Business and Indiana University.

The work, published in the International Journal of Research in Marketing, also identifies ethical issues raised by the use of computer-generated content.

“Writing product reviews is difficult for both humans and computers, especially because of the large number of distinct products,” says Keith Carlson. “We wanted to see how AI can be used to help the people who write and use these reviews.”

For this work, the Dartmouth team set up two challenges. The first was to determine whether a machine could be trained to write original, “human-grade” reviews using only a small number of product features, after being trained on existing content. For the second, the research team wanted to know whether machine learning algorithms could be used to write syntheses of products for which several reviews already exist.

“Using AI to write and synthesize reviews can drive efficiency on both sides of the market,” says Prasad Vana, assistant professor of business administration. “We hope that AI can help reviewers with heavy workloads, and consumers who have to sift through large amounts of content about different products.”

The researchers focused on wine and beer reviews because of the wide availability of content for training the computer algorithms. Reviews of these products also use a relatively specific vocabulary, which is an advantage when working with AI systems.

To determine whether a computer could write useful reviews from scratch, the researchers fed an algorithm around 180,000 existing wine reviews. Metadata covering aspects such as product origin, grape variety, rating and price was also used to train the machine learning system.
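To make this setup concrete, here is a minimal sketch, assuming a GPT-2 model from the Hugging Face transformers library (the study does not specify its model or code), of how a review can be paired with its metadata so that a language model learns to generate reviews conditioned on attributes such as origin, grape variety, rating and price. The field names and the example row are hypothetical.

```python
# Minimal sketch (not the study's actual code): condition review generation
# on serialized product metadata, using GPT-2 as an assumed model choice.
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def build_example(meta: dict, review: str) -> str:
    """Serialize product metadata into a prompt, followed by the target review."""
    prompt = (f"origin: {meta['origin']} | variety: {meta['variety']} | "
              f"rating: {meta['rating']} | price: {meta['price']}\nreview:")
    return prompt + " " + review + tokenizer.eos_token

# One hypothetical row out of the ~180,000 used for training.
example = build_example(
    {"origin": "Bordeaux", "variety": "Merlot", "rating": 90, "price": "$25"},
    "Ripe plum and cedar on the nose, with soft tannins and a long finish.",
)

# Standard causal language-modelling step: with labels supplied, the model
# returns the loss for predicting each next token of the serialized example.
inputs = tokenizer(example, return_tensors="pt")
loss = model(**inputs, labels=inputs["input_ids"]).loss
loss.backward()  # one gradient step of the many taken over the full corpus

# After fine-tuning, generation is conditioned on the metadata alone.
prompt = tokenizer(
    "origin: Bordeaux | variety: Merlot | rating: 90 | price: $25\nreview:",
    return_tensors="pt",
)
generated = model.generate(**prompt, max_new_tokens=60, do_sample=True,
                           top_p=0.9, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```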

By comparing computer-generated reviews to those written by humans for the same wines, the research team found similarities between the two versions. The results remained consistent, even as the team challenged the algorithm by changing the amount of data available for reference.
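The article does not say which comparison metric the team used, but one simple, assumed way to quantify how close a machine-written review is to a human review of the same wine is TF-IDF cosine similarity, sketched below with made-up review texts.

```python
# Sketch of one possible similarity check (an assumption, not necessarily
# the study's metric): TF-IDF cosine similarity between two review texts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

human_review = "Bright cherry and vanilla, silky tannins, a long spicy finish."
machine_review = ("Aromas of cherry and vanilla lead to smooth tannins "
                  "and a spicy, lingering finish.")

# Vectorize both reviews in the same TF-IDF space and compare them.
vectors = TfidfVectorizer().fit_transform([human_review, machine_review])
print(f"cosine similarity: {cosine_similarity(vectors[0], vectors[1])[0, 0]:.2f}")
```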

The computer-generated content was then evaluated by non-expert participants to see whether they could tell if the reviews had been written by machines or by humans. According to the study, participants were unable to distinguish between the two types of content; moreover, their intention to buy a wine was similar regardless of whether the reviews of the wine in question were machine- or human-written.

Testing the ability to synthesize reviews

After finding that AI could write passable wine reviews, the research team turned to beer reviews to assess the computer’s ability to write “syntheses”. Rather than being trained to produce new content, the algorithm was asked to group together elements drawn from existing reviews of the same product. This tested the AI’s ability to identify and deliver limited, but useful, product information based on a wide variety of opinions.

To do this, the researchers provided 143,000 reviews of over 14,000 beers to the algorithm. As with wine, the text of each review was accompanied by metadata including the name of the product, its alcohol content, the style of the drink, as well as the ratings given by the original reviewers.
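As an illustration only (not the study’s actual pipeline), a synthesis of this kind can be approximated extractively: pool the sentences from several reviews of the same beer, cluster them by TF-IDF similarity, and keep one representative sentence per cluster. The reviews below are invented.

```python
# Sketch of extractive synthesis under assumed choices (TF-IDF + k-means);
# the study's own grouping method is not described in this article.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np

reviews = [  # hypothetical reviews of a single beer
    "Pours a hazy gold with a thick white head. Strong citrus aroma.",
    "Lots of grapefruit and pine on the nose; bitterness is assertive but balanced.",
    "Very hazy, juicy body. The finish is dry with lingering hop bitterness.",
]

# Split the pooled reviews into individual sentences.
sentences = [s.strip() for r in reviews for s in r.split(".") if s.strip()]

# Represent each sentence as a TF-IDF vector and group similar sentences.
vectors = TfidfVectorizer().fit_transform(sentences)
n_points = 3  # how many key points the synthesis should contain
kmeans = KMeans(n_clusters=n_points, n_init=10, random_state=0).fit(vectors)

# For each cluster, keep the sentence closest to the cluster centre.
summary = []
for k in range(n_points):
    idx = np.where(kmeans.labels_ == k)[0]
    distances = np.linalg.norm(
        vectors[idx].toarray() - kmeans.cluster_centers_[k], axis=1)
    summary.append(sentences[idx[np.argmin(distances)]])

print(" / ".join(summary))
```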

According to the study, the computer model was indeed able to group the reviews and present their essential points.

According to the research team, algorithms are not intended to replace professional reviewers and marketers, but rather to help them in their work. A computer-generated review, for example, could serve as a first draft that would then be reviewed by a human.

The work could also help consumers, say the researchers. Summaries of multiple reviews could be generated for many products and services offered online, helping people who do not have the time to read through large numbers of opinions and reviews.

However, the researchers say they are aware of the ethical issues related to the use of computer algorithms to influence the behavior of human consumers.

Aware that marketers could make computer-generated content more readily accepted by passing it off as human-written text, the researchers are calling for transparency when it comes to machine-generated reviews.

“As with other technologies, we need to be careful how this new development is used,” says Carlson. “If used responsibly, computer-generated reviews can become a tool to foster productivity and help make useful information available to consumers.”
