I realize and understand the criticisms of ChatGPT, and I have personally seen how bad it can be. Once I asked it to count the number of days until a random date, given the present date, and it failed miserably, again and again. Trust me, I get the criticism. But what about Bing Chatbot?

Have you ever tried asking it Physics and Maths questions? I was coding a while ago and had a pretty complex question that a very popular Reddit coding community couldn't solve, but Bing Chatbot gave an answer in an instant! I was genuinely impressed. Apparently it checks multiple webpages on the internet, reads and understands them, and gives an answer after combining the knowledge it gained from its search. Again, the question I asked was pretty complex, but it answered it in an instant, and the answer was right! And this was coding, where it's pretty hard to get the right answer on the first try; I've found it's usually more "trial and error".

So yeah!

  1. Can I rely partially on Bing Chatbot for math questions?
  2. If not, can I ask it to form a query that encapsulates my question perfectly?
  3. If not, should I ask it to "Answer this question and cite your sources"?
  4. Can I do something more, along the lines of what I did in 3? What are your thoughts on this?

I won't be able to reply to each of your comments anytime soon, but know that I deeply appreciate this community, its members, and their help :')

  • flashgnash@lemm.ee · 1 year ago

    They can definitely be made to work out arithmetic and similar, though.

    If you were to say in the preprompt something like: "When asked a mathematical question, please respond with the equations used to achieve the result."

    For example, if you asked it what 3x4 is, it could respond with "The answer is {3x4}", and then the {3x4} could be evaluated in software afterwards and dropped in for the user to see.
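    A minimal sketch of what that evaluation step could look like, assuming the reply marks expressions with {…} and only plain arithmetic is allowed; the function names here are made up for illustration, not anything Bing or ChatGPT actually exposes:

    ```python
    # Sketch: find {…} placeholders in a model reply and evaluate them with a
    # restricted arithmetic evaluator, so the model never does the arithmetic itself.
    import ast
    import operator
    import re

    # Only basic arithmetic operators are allowed; anything else raises.
    _OPS = {
        ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv,
        ast.Pow: operator.pow, ast.USub: operator.neg,
    }

    def _eval_node(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval_node(node.left), _eval_node(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval_node(node.operand))
        raise ValueError("unsupported expression")

    def fill_placeholders(reply: str) -> str:
        """Replace every {expr} in the reply with its computed value."""
        def repl(match):
            expr = match.group(1).replace("x", "*")  # the model wrote 3x4 rather than 3*4
            return str(_eval_node(ast.parse(expr, mode="eval").body))
        return re.sub(r"\{([^{}]+)\}", repl, reply)

    print(fill_placeholders("The answer is {3x4}"))  # -> The answer is 12
    ```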

    I think that might be what ChatGPT does now, as they somewhat recently fixed it always getting maths wrong.

    Or alternatively you could ask it to simply write a script to work out whatever problem it's given that isn't linguistic, and execute that in a sandboxed environment (though that might still be too risky in case it generates some bad code).
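    A very rough sketch of that second idea follows; a subprocess with a timeout is nowhere near a real sandbox on its own, so in practice you'd want a container or similar isolation, and `generated_code` here just stands in for whatever script the model produced:

    ```python
    # Sketch: run a model-generated script in a separate, time-limited Python process.
    import subprocess
    import sys
    import tempfile

    generated_code = "print(sum(range(1, 101)))"  # placeholder for the model's script

    # Write the generated script to a temp file so it can be executed on its own.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(generated_code)
        script_path = f.name

    result = subprocess.run(
        [sys.executable, "-I", script_path],  # -I: isolated mode, ignores user site dirs
        capture_output=True, text=True, timeout=5,
    )
    print(result.stdout.strip())  # -> 5050
    ```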