You don’t need “AI” for that. All you would need is some standardized APIs for the various shops, and you could easily solve this with computer technology from 20 years ago.
The reality is, though, that there are no such APIs. LLMs, on the other hand, could be a valid tool for this use case.
LLMs are not a good tool for processing data like this. They would be good for presenting that data, though.
LLMs are great for scraping data.
LLMs don’t scrape data; scrapers scrape data. LLMs predict text.
It’s not that there’s no API. It’s that there’s probably a different API for every single grocery store. And they make random changes and don’t have public documentation. That’s why we need the AI.
Indeed. LLMs read with the same sort of comprehension that humans have, so if a supermarket makes their website compatible with humans then it’s also compatible with LLMs. We have the same “API”, as it were.
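In practice that boils down to something like the sketch below: a plain scraper fetches the page and the LLM only reads the text it’s handed. The URL, model name and prompt wording here are just placeholders, not any real store’s setup.

```python
# Rough sketch of "the page is the API": a scraper fetches the page,
# the LLM only interprets the text it is given.
import requests
from bs4 import BeautifulSoup
from openai import OpenAI

client = OpenAI()

def fetch_page_text(url: str) -> str:
    # Scraper part: download the same page a human would open.
    html = requests.get(url, timeout=10).text
    # Strip the markup so the model sees roughly what a human reads.
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

def extract_price(page_text: str, item: str) -> str:
    # LLM part: answer only from the text placed in its context.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Find the price of the requested item in the page "
                        "text provided by the user. Answer with the price only."},
            {"role": "user",
             "content": f"Item: {item}\n\nPage text:\n{page_text[:8000]}"},
        ],
    )
    return resp.choices[0].message.content

# e.g. extract_price(fetch_page_text("https://store.example/milk"), "whole milk 1L")
```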
deleted by creator
That sounds like an issue with your system prompt. If you’re using an LLM to interpret web pages for price information, then you’d want to include instructions about what to do if the information simply isn’t in the web page to begin with. If you don’t tell the AI what to do under those circumstances, you can’t expect any specific behaviour, because it wouldn’t know what it’s supposed to do.
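For example, a system prompt along these lines would cover the missing-price case (the exact wording and the NOT_FOUND sentinel are just one way to do it):

```python
# Sketch of a system prompt that handles "the price isn't on the page".
SYSTEM_PROMPT = (
    "You extract grocery prices from the page text the user provides. "
    "Use only that text; do not rely on remembered prices. "
    "If the requested item or its price is not present in the text, "
    "reply with exactly NOT_FOUND. Never guess or estimate a price."
)

messages = [
    {"role": "system", "content": SYSTEM_PROMPT},
    {"role": "user", "content": "Item: oat milk 1L\n\nPage text:\n<page text here>"},
]
# The caller then checks for NOT_FOUND instead of trusting a made-up number.
```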
I suspect from this comment that you haven’t actually worked with LLMs much, and are just going off the general “lol they hallucinate” perception they have right now? I’ve worked with LLMs a fair bit, and they very rarely have trouble interpreting what’s in their provided context (as would be the case here with web page content). Hallucinations come from relying on their own “trained” information, which they recall imperfectly and which often gets a bit jumbled. To continue the human analogy, it’s like asking someone to rely on their own memory rather than reading information from a piece of paper.
Or you could just prompt it not to guess prices for items that don’t exist. Those models are pretty good at following instructions.
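For instance, you could ask for JSON and spell out the null case; the item and field names below are only an illustration.

```python
# One way to phrase the "don't guess" instruction: request JSON and make
# the null case explicit for items that aren't on the page.
import json

INSTRUCTION = (
    "For each item on the shopping list, return a JSON object mapping the "
    "item name to its price from the page text, or to null if the item does "
    "not appear on the page. Do not invent prices for missing items."
)

# A well-behaved reply for a page that only lists bread would look like:
reply = '{"bread": 1.99, "unicorn steaks": null}'
print(json.loads(reply))  # {'bread': 1.99, 'unicorn steaks': None}
```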
deleted by creator