August 12, 2025
I've got the NLP running. I tried using ML, but without a CUDA GPU that takes too long, so I stuck with sentence transformers from Hugging Face. The uncategorised reviews WILL be categorised using GPT-2. So yeah. I'll create my next devlog once I've got everything running from 1 button: product ---> seller reviews, AI analysed.
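Since the real embeddings come from sentence-transformers (which needs a downloaded model), here's just the matching step, sketched with toy 3-dimensional vectors standing in for `model.encode()` output. The category names and numbers are invented for illustration:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def categorise(review_vec, category_vecs):
    """Return the category whose embedding is closest to the review's."""
    return max(category_vecs, key=lambda name: cosine(review_vec, category_vecs[name]))

# Toy "embeddings"; in practice these come from SentenceTransformer.encode().
categories = {"shipping": [1.0, 0.1, 0.0], "quality": [0.0, 1.0, 0.1]}
print(categorise([0.9, 0.2, 0.0], categories))  # -> shipping
```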
I've got the page number and the origins removed from the variables I'm checking, so now I'm left with just the questions, marks and question numbers.
Question number and mark have been appended to a dictionary. Next I need the actual question text in there as well.
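Tacking the question text onto the same dictionary might look like this; the line format behind `QUESTION_RE` is my guess at how the extracted PDF text is laid out, not the project's actual format:

```python
import re

# Hypothetical line shape: "3. State Newton's first law. [2 marks]"
QUESTION_RE = re.compile(r"^(\d+)\.\s*(.+?)\s*\[(\d+)\s*marks?\]$")

def parse_questions(lines):
    """Map question number -> {"text": ..., "mark": ...}, skipping non-question lines."""
    bank = {}
    for line in lines:
        m = QUESTION_RE.match(line.strip())
        if m:
            num, text, mark = m.groups()
            bank[int(num)] = {"text": text, "mark": int(mark)}
    return bank
```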
I'm building a smart exam assistant that takes exam papers (PDFs), extracts each question and its answer, and stores them in a subject-specific database. Users can submit questions to be added, but I approve them first to keep quality high. Once approved, everyone shares access to a massive question bank. Users can search for questions using keywords—like "inertial mass"—and generate custom quizzes based on what they want to study. They can even choose how many questions they want, and the system will turn it into a clean, downloadable PDF quiz. It’s like having a personalized exam generator, powered by a crowd-sourced, curated question database.
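The keyword search and quiz generation described above could be sketched like this (the question list and field names are invented for illustration, and a PDF writer would render the result):

```python
import random

def search_questions(questions, keyword):
    """All questions whose text contains the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [q for q in questions if kw in q["text"].lower()]

def make_quiz(questions, keyword, n, seed=None):
    """Pick up to n matching questions for a custom quiz."""
    hits = search_questions(questions, keyword)
    return random.Random(seed).sample(hits, min(n, len(hits)))

questions = [
    {"text": "Define inertial mass.", "mark": 2},
    {"text": "Define gravitational mass.", "mark": 2},
    {"text": "State Hooke's law.", "mark": 1},
]
```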
Sorry for the terrible mic, first time trying to use it while screen recording 😭 I'll use a different one next time. Anyways, I can translate the entire database in under 30 seconds for now (foreshadowing 🫡). Combined all the code into 1 script so I just need to press 1 button to start the whole thing, instead of running the feedback URL finder, then the feedback scraper, then the translator. My next task will be to create the AI summariser model and the user input.
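The one-button version is really just chaining the three old scripts; here's the shape of it, with stub stage functions standing in for the real finder, scraper and translator:

```python
# Stubs standing in for the three separate scripts being merged.
def find_feedback_urls(product_url):
    return [product_url + "/feedback"]

def scrape_feedback(url):
    return [{"text": "muy buen vendedor", "positive": True}]

def translate(review):
    # Stand-in for the real translator.
    return dict(review, text="[EN] " + review["text"])

def run_pipeline(product_url):
    """One button: URL finder -> scraper -> translator, chained."""
    reviews = []
    for url in find_feedback_urls(product_url):
        for review in scrape_feedback(url):
            reviews.append(translate(review))
    return reviews
```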
Most of the time I can evade captcha just by going slow enough that it thinks I'm human... wow, eBay really does believe we're inferior to bots 😭. Anyways, I've got scraping working and have successfully scraped 400 reviews, including the actual comment and whether it was positive or not. Now I need to work on either art or actually combining my 2 scraping scripts so that it goes Product ---> Reviews instead of Product URL -> Feedback URL -> Scrape. Nahh, need to make that one big motion + I need to put some actual headers on my scraping driver so users don't actually see the browser I'm scraping on 💀. After this, I've seen a big problem where the reviews can be written in different languages, so... need to get this massive database translated as well 🫡. Even worse, 400 is just testing; when shipped I'm going to increase it to 2000. We also do seller second chances, so we can just take in their reviews from the last 6 months. Yes, it may seem I can't scrape dates, but that's in another file 🤫. After I translate this database I need to feed it into an NLP model to summarise all of this information and pull common findings from the positive and negative reviews. After that I need to connect my frontend to my backend and tweak my frontend to make it more me 😉. NEARLY DONE
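As a first stab at "common findings", even a plain word count per sentiment bucket gets surprisingly far. A crude sketch (the stop-word list is made up here), nowhere near a real NLP summariser:

```python
import re
from collections import Counter

def common_findings(reviews, top=3):
    """Most frequent words in positive vs negative reviews."""
    stop = {"the", "a", "was", "is", "and", "it", "very"}  # invented stop list
    buckets = {True: Counter(), False: Counter()}
    for r in reviews:
        words = re.findall(r"[a-z']+", r["text"].lower())
        buckets[r["positive"]].update(w for w in words if w not in stop)
    return {
        "positive": [w for w, _ in buckets[True].most_common(top)],
        "negative": [w for w, _ in buckets[False].most_common(top)],
    }
```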
Scraping Final Boss: Captcha 💀
I have no idea how to bypass Captcha, or even how I'm triggering it. Probably because I'm running it too fast, but I don't really know. The good news is that I do scrape feedback every now and then, when the captcha doesn't pop up. I literally scraped 200 that time and I'll show it in my next devlog.
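Since going slow is what seems to keep the captcha away, a small jittered pause between page actions makes the pacing deliberate instead of accidental. The base and jitter values here are guesses, and `rng`/`sleep` are injectable only so it can be tested:

```python
import random
import time

def human_pause(base=2.0, jitter=3.0, rng=random.random, sleep=time.sleep):
    """Wait a random, human-looking amount of time between actions.

    Returns the delay actually used, somewhere in [base, base + jitter).
    """
    delay = base + jitter * rng()
    sleep(delay)
    return delay
```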
YOO. So a lot has happened. I think I've done most of the entire project in the past 2 hours (only counted as 24 😭). I've got Selenium clicking on all the right things and did some URL exploiting. I've also found out that the feedback username is different to the seller username on the product.... maybe related to the product they sell, since some have multiple shops set up. Anyways, all I need to do now is read that last URL, pull all of the feedback, and get AI to analyse the common negative and positive reviews. So yeah... 80% done.
Looks like it's working at the start, but it turns out eBay removed the next-page button on the last page, so now I need to find out if I'm scraping the same page again (my tutorial is outdated)... might just end up putting in an if statement for when it reaches page 10, cus that's enough for me.
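The stop condition could cover both cases at once: bail at page 10, and bail early if the same page comes back. `get_page` here is a stand-in for the real scraping call:

```python
def scrape_all_pages(get_page, max_pages=10):
    """Collect reviews page by page; stop at max_pages, or earlier if the
    same page comes back (eBay can serve the last page again past the end)."""
    seen = set()
    results = []
    for page_num in range(1, max_pages + 1):
        page = get_page(page_num)
        key = tuple(page)
        if key in seen:
            break  # same page again -> we ran off the end
        seen.add(key)
        results.extend(page)
    return results
```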
Using Cursor now instead of VS Code (; . But even Cursor is falling for eBay's lil party tricks:
Excellent! 🎉 Now it's working! The debug information shows:
✅ Function is being called correctly
✅ Getting a 200 response from eBay US
✅ Found an item URL: https://ebay.com/itm/123456?...
✅ Successfully extracted item ID: 123456
✅ Returning success: true
Bro thinks 123456 is an actual item 😂. Look, it might be, but Cursor is looking for a phone 💀
This project's biggest nightmare is the use of my API keys, so I'm just going to switch to scraping. My keys used to work (yesterday), but it's too much effort to get them working again. So I'm going to start making a lot of eBay accounts to act as my 'user access token' instead of an actual API.
So I've used some sneaky AI now... It's really good and no one talks about it. Of course I'm not going to say which one, but the link for my project says enough: https://ebay-review-whisperer.lovable.app
🧠 State of Mind
After Devlog 3, I hit coding beast mode. Now? I’m cooked. Sleep is calling. But today was a breakthrough day.
🔍 Search Function Upgrade
Built a new search function using the eBay API to browse listings.
Product links are officially out—now users search by name, and the system analyzes the most relevant result.
This change makes the experience smoother and more dynamic.
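The search flow above could look roughly like this. The endpoint and response field names follow eBay's Browse API docs, but the token is a placeholder and nothing here is the project's exact code:

```python
import json
import urllib.parse
import urllib.request

# Endpoint and field names per eBay's Browse API; limit=1 grabs the top match.
BROWSE_SEARCH = "https://api.ebay.com/buy/browse/v1/item_summary/search?q={query}&limit=1"

def build_search_request(query, token):
    """Build the GET request for a name-based listing search."""
    url = BROWSE_SEARCH.format(query=urllib.parse.quote(query))
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def top_result(response_body):
    """Pull the most relevant listing's ID out of a search response."""
    items = json.loads(response_body).get("itemSummaries", [])
    return items[0]["itemId"] if items else None
```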
🔑 API Key Chaos
Discovered that I need multiple API keys for different endpoints—more than just three for one site.
Realized that regional targeting matters: searching within my region yields more accurate seller info.
Solved a major issue where my user access token was invalid. Fixed it by signing into my real eBay account and scripting a fresh token call.
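The "fresh token call" might be sketched like this, assuming eBay's standard OAuth refresh-token grant; the client credentials and scope are placeholders, and this only builds the request rather than sending it:

```python
import base64
import urllib.parse
import urllib.request

TOKEN_URL = "https://api.ebay.com/identity/v1/oauth2/token"

def build_token_request(client_id, client_secret, refresh_token, scope):
    """Build the POST that trades a refresh token for a fresh user access token."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    body = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "scope": scope,
    }).encode()
    return urllib.request.Request(
        TOKEN_URL,
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
    )
```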
📊 Seller Feedback (WIP)
Currently working on integrating seller feedback.
Most sellers return info like:
username: techmobile4u
feedbackScore: 34585
feedbackPercentage: 99.9
However, the feedback API is currently unresponsive, so I’m not able to fetch live feedback data yet.
✅ API Wins
Lots of successful calls today—finally seeing results.
The system is starting to feel robust, even if the feedback endpoint is being stubborn.
Today’s mission: extract feedback percentage, score, and username from a lone product link. Sounds simple. Reality? A whole lotta null. The data gods are not smiling.
🔍 Progress Report:
Refactored the codebase—cleaned it up like a digital spring cleaning.
Modularized the chaos: started wrapping logic into neat little functions.
Took the plunge into the eBay Developer Program and began using their free APIs. (Because scraping is so last season.)
😤 Current Mood: Disappointed, but not done. The APIs are playing hard to get, and the data's ghosting like it owes me money. But I'm not giving up. I'm the kind of dev who stares into the void and says, "I'll refactor you."
🧠 Next Steps:
Double-check API endpoints and permissions—sometimes they’re sneakier than bugs.
Log everything. Even the logs. Null might be hiding behind a bad request or missing header.
Maybe throw in a test with a different product link—just to see if the issue’s with the item, not the code.
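Point two ("log everything") could be as simple as wrapping each fetch. A minimal sketch with the stdlib logging module, where `checked_fetch` and the logger name are mine, not the project's:

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("ebay")

def checked_fetch(fetch, url):
    """Wrap any fetch call so nulls leave a paper trail."""
    log.debug("requesting %s", url)
    result = fetch(url)
    if result is None:
        log.warning("got null back from %s - check headers/permissions", url)
    else:
        log.debug("got %r", result)
    return result
```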
So turns out the reason my PC was throwing tantrums wasn't some deep, mysterious bug—it was me. I slapped in a print() instead of an input() like a true coding gremlin, and the program just kept screaming into the void. 💀
Fixed it by yeeting the input() into the while loop and leaving the print() where it was. Simple fix for a bug that felt like it was trying to assassinate my entire setup.
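For the record, the fixed shape looks something like this (the prompt text is invented, and `reader`/`writer` are injectable only so it can be tested without a real keyboard):

```python
def prompt_loop(reader=input, writer=print):
    """Ask inside the loop. The old bug printed the prompt and then spun
    forever, because the input() call lived outside the while."""
    while True:
        writer("Paste an eBay link (or 'q' to quit):")
        link = reader()
        if link == "q":
            return "done"
        writer(f"got {link}")
```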
Anyway, I’ve now imported re to try and read eBay links and extract the item ID. Spoiler alert: my human eyes can't even find the ID in those URLs, so I’m letting the code do the detective work. 🕵️♂️
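The `re` detective work boils down to one pattern: eBay listing URLs carry the item ID after `/itm/`, sometimes hidden behind a product slug. A sketch, assuming that URL shape:

```python
import re

# Matches /itm/<digits> and /itm/<slug>/<digits>.
ITEM_ID_RE = re.compile(r"/itm/(?:[^/?]*/)?(\d+)")

def extract_item_id(url):
    """Return the numeric item ID from an eBay listing URL, or None."""
    m = ITEM_ID_RE.search(url)
    return m.group(1) if m else None
```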
But honestly? Parsing eBay links manually is like trying to read hieroglyphics while blindfolded. So I’m calling in the big guns—API key time. Let’s gooo. 🚀
Paste any eBay product link to instantly get an AI-powered analysis of the seller’s reviews. The tool extracts seller data, fetches buyer feedback by scraping eBay, and uses NLP to summarize sentiment, highlight common issues, and generate a trust score, helping users make smarter buying decisions.
This was widely regarded as a great move by everyone.