This may simply be grammatical, so I wanted to confirm it was designed to be this way.
Other AI code sidekick chats ask “Do you want me to do xyz?”
Then, when they execute, they say something like “I am now doing xyz” and log when complete.
I noticed Gemini states “I will do xyz”, and then nothing happens.
So after a few seconds I tried typing “ok”, and it moved forward.
This is likely normal behavior, although it never prompts the person to confirm.
Far from a big deal, but if the statement were turned into a question, it might guide the user.
Hi @Sea - that sounds like a bug. I’ve tagged @ali for awareness.
Are you still seeing similar delays, where Gemini needs to be nudged before it gives an answer?
It is loads better, although still intermittent: it will either give an answer, ask for confirmation, provide a button to execute (<=== ideal, imho), say it cannot answer, or simply do nothing.
EDIT: I have now noticed that when I paste an error into the prompt while troubleshooting, it sometimes does not provide any feedback or context about the change it wants you to make. Without that feedback, the user can neither learn why something was an error nor determine whether the proposed solution is reasonable. I note this because I currently have a window open, going down a rabbit hole of ineffective solutions, and about half of them are just “click the button to execute.”
Can you please file a bug: http://issuetracker.google.com/issues/new?component=1379083&template=1836320
Include a screenshot of the responses and we’ll take a look. Please also share the link to your filed bug report so I can help track it.