Beware: ChatGPT and Other AI Tools Have Invented Fictitious Tourist Destinations
Illustration (Freepik)

Artificial intelligence (AI) has become a travel companion for many tourists. Tools such as ChatGPT, Microsoft Copilot, and Google Gemini, as well as dedicated apps such as Wonderplan and Layla, are used by millions of people to plan their trips. According to the BBC, a 2024 survey found that around 30% of international travelers had relied on AI to find destinations, build itineraries, and estimate vacation costs.

Behind this convenience, however, lies a serious problem: not all of the information provided is correct. Some of it is pure "hallucination", cases in which the AI fabricates the details of a place or a travel route. This is not merely confusing; it can be fatal for tourists who take it at face value.

A fictitious destination in Peru

One case occurred in Peru. Miguel Angel Gongora Meza, founder and director of Evolution Treks Peru, described a strange encounter with two foreign tourists in a mountain village. They were planning to hike to the "Sacred Canyon of Humantay", a place that has never existed.

"They showed me screenshots from ChatGPT, written very convincingly and full of beautiful descriptions. In reality, no such place exists," said Gongora Meza.

According to him, the name is simply a blend of two unrelated locations. Worse still, the tourists had paid nearly US$160 just to reach a rural road near Mollepata, Peru, without a guide and without a real destination. Gongora Meza warned that mistakes like this can endanger lives.

"In Peru, elevations can reach 4,000 meters, the weather changes quickly, trails are hard to access, and there is almost no phone signal. With misinformation, the risk is enormous," he explained.

Stranded on a Japanese mountaintop

Another case befell Dana Yao and her husband in Japan. They used ChatGPT to plan a romantic hike up Mount Misen on Itsukushima Island. After exploring the town of Miyajima, they began their climb at 3 p.m. to catch the sunset, following the AI's instructions.

The problem arose on the way down. ChatGPT had said the last cable car operated until 5:30 p.m. In reality, the cable car had closed earlier.

"We ended up stranded at the top of the mountain with no transportation," said Yao, a travel blogger based in Japan.

An Eiffel Tower in Beijing and a marathon across northern Italy

In 2024 the BBC also reported similar oddities from Layla, an AI-powered travel app. The app once claimed there was an Eiffel Tower in Beijing, and recommended a marathon route across northern Italy that did not exist.

"The itineraries made no sense. We would have spent more time on transportation than actually enjoying the trip," said a British traveler cited in the same report.

A survey from the same year found that 37% of people who used AI for trip planning felt the information was incomplete, while another 33% said the recommendations contained false information.

Rayid Ghani, a professor of machine learning at Carnegie Mellon University, explained that the problem is rooted in how AI works. "AI doesn't know the difference between a recipe, driving directions, or travel advice. It just strings words together so that they sound plausible," he said.

Large language models (LLMs) such as ChatGPT are built by analyzing vast amounts of text and producing answers based on statistical patterns. Sometimes the results are accurate, but they are often hallucinations: information that is entirely made up. Because AI presents facts and hallucinations in exactly the same way, users struggle to tell what is real.
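To make the point concrete, here is a deliberately oversimplified toy sketch (a word-pair "bigram" chain, nowhere near ChatGPT's real architecture, with an invented corpus): the program continues text purely from which words tend to follow which, so it can splice real place names into fluent combinations that describe nowhere at all.

```python
import random

# Toy corpus: fragments mentioning real-sounding Peruvian place names.
# (Invented for illustration; not training data from any real model.)
corpus = (
    "the sacred valley of peru the humantay lake of peru "
    "the sacred canyon of arizona the humantay trek of peru"
).split()

# Record which word follows which in the corpus.
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

def generate(start, length, seed=0):
    """Chain statistically plausible next words, with no notion of
    whether the resulting place actually exists."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        choices = follows.get(words[-1])
        if not choices:
            break
        words.append(rng.choice(choices))
    return " ".join(words)

# Depending on the random seed, the chain can emit blends such as
# "sacred canyon of humantay ..." -- fluent, but fictitious.
print(generate("sacred", 4))
```

Every word transition here is "justified" by the corpus, yet the whole phrase can name a place that does not exist, which is essentially how a fabricated "Sacred Canyon of Humantay" can read so convincingly.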

In the case of the "Sacred Canyon of Humantay", for example, the AI likely just assembled words that sounded appropriate for the region. Ghani added that although AI can analyze enormous amounts of data, it has no understanding of the physical world. It might treat a relaxed 4 km stroll the same as a 4,000-meter mountain climb. (BBC/Z-2)
