If it's a situation where bad info can cost you significant time or money, then Copilot is not the right tool for the job. But that's not every query, and I wouldn't even say it's more than half of our search queries.
It's boring that people are completely unwilling to engage with the argument at all. Comparing a situation where you get wrong info from an LLM with one where you get correct info from Google is pointless and shows nothing. Ignoring that people get wrong info from Google all the time is disingenuous. Everyone would acknowledge Google is still a useful resource even though there's a chance of coming away with the wrong answer.
So the real comparison isn't a correct answer from Google 100% of the time vs a correct answer from Copilot 60% of the time. It's a correct answer from Google 90% of the time vs Copilot's 70%. Then you can weigh the options and use the right tool for the job.
We can acknowledge that assistants like Siri are good even though they have had data privacy issues. YouTube has privacy issues, but I still use it and consider it a good service. Lemmy has privacy issues, and I still use it and consider it a good service.
Overall, Siri is a good feature, and a majority of users use it.
No, it's not. Firstly, 99% of people have no idea what that button is.
Secondly, opening a web browser, going to Google, typing in your question, pressing "I'm Feeling Lucky", and then searching through the webpage is way slower than hitting the Copilot button, typing your question, and getting a quick direct answer.
I think you're getting confused by the marketing. The marketing makes it out to be this useful thing that will do your work for you, which it can't. It doesn't have features; it's just an LLM. You ask a question, it returns an answer, and it's really not much more than that. The productivity increase comes from people getting fast answers to their questions and quick templates for written work.
I'm finding it funny how many people disagree with the line about information retrieval. We get a ton of untrustworthy information all the time; we know there's a chance of it being wrong, and we weigh the consequences against the extra effort it would take to verify. If I'm about to stake my career on a fact, I'm not going to rely on ChatGPT, but if I just need to see some popular UI frameworks, then ChatGPT is fine. If it's wrong, that's fine; there's nothing riding on it, so I just move on and check the next one.
That's fair, and Gemini may be better, but I don't think the difference in quality is make or break conceptually. They both fill the purpose well enough for me to see that the feature has potential, even if Gemini would have been a better choice.
Unless you think people always come away from Google with the right answer, I don't see the 1:1.
If you NEED the right answer, you should go to a trusted source, same as if you were using Google. If you're just looking for an answer, then blogspam articles, Reddit, or AI will usually all be good enough to return something satisfying. AI is just a faster way of searching a question on Google and clicking the top result.
Specifically NEED. Very few things in our day-to-day life NEED to be correct. Typically it's good to be correct, but our use cases can handle being wrong, because either the stakes are low or we'll be diving deeper into the topic as we narrow down our information search.
We do these kinds of searches all the time. Every time you ask an average person a question, you're performing one of these searches. Every time something pops into your head and you want a quick answer, you're performing one of these searches. When you search for information online, you're generally performing one of these kinds of searches.
An example could be that I want to know a few of the popular Python libs for interacting with Atlassian. It gives me a list of libs with links, and I can go check them out.
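A minimal sketch of that workflow: treat the LLM's suggestions as unverified candidates and check them yourself before relying on them. The module names below are hypothetical stand-ins for what an LLM might return, not real suggestions from any model; the check just asks whether a module is importable in the current environment.

```python
import importlib.util

def installed(module_name: str) -> bool:
    """Return True if a module with this name is importable here."""
    return importlib.util.find_spec(module_name) is not None

# Suppose the LLM suggested these module names; verify before trusting them.
suggestions = ["json", "definitely_not_a_real_module"]
results = {name: installed(name) for name in suggestions}
print(results)
```

"json" ships with Python, so it should check out; the made-up name should not. The same triage (verify the cheap-to-verify, discard the rest) is what makes low-stakes LLM answers usable.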
That actually says the opposite. That link says that Obama didn't intend to strike the hospital, nor did the general; it was struck due to an equipment malfunction and the info relayed to the strike crew, which caused it to be identified as a similar-looking Taliban-controlled building. So it was a lower-level chain-of-command and procedure failure. Operating in that theater and ordering those strikes was still the correct thing to do.
You can even look at the response once they realised the mistake. Trump's admin would never do anything close to that.
You do realise there is another half of the country that opposes this, right? Even if 55% of the country tries to throw Trump out, 45% will show up to stop him, and that will be a very bloody conflict.
Better just to wait until the midterms and the next election and vote him out.
Bombing a country isn't inherently a bad thing. Those strikes were conducted with the support of regional allies against terrorist orgs. But I know people like you never miss an opportunity to take attention away from Trump and point it towards Democrats.