[Issue]: FSM response not in real time #12

Open
wi0lono opened this issue Apr 9, 2024 · 17 comments
Labels: bug (Something isn't working)

@wi0lono (Collaborator) commented Apr 9, 2024

Describe the issue

The FSM doesn't stream output (return it in real time). Output is returned only when the FSM waits for input or after it has exited.

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response

@varun-heman added the enhancement (New feature or request) label on Apr 10, 2024
@shreypandey (Collaborator):

@wi0lono Can you please share more details regarding the issue? Do you mean the output returned to the channel?

@wi0lono (Collaborator, Author) commented Apr 10, 2024

Yes.
Within my FSM, I have:

self.send_message("message1")
# long running process
self.send_message("message2")

Ideally, message1 should be returned to the user as soon as the first send_message() runs, but the messages all seem to be returned at once, when the FSM waits for input or ends.

@shreypandey shreypandey added bug Something isn't working and removed enhancement New feature or request labels Apr 12, 2024
@sameersegal (Collaborator):

At the moment this should be handled in the FSM design, by breaking this into two different states:

state n:
  self.send_message("message1")
  # automatically move forward

state n+1:
  # long process
  self.send_message("message2")

@sameersegal added the wontfix (This will not be worked on) label on Jun 9, 2024
@sameersegal (Collaborator) commented Jun 9, 2024

We should create a separate item for a streaming version of JB-Manager that can work with custom channels that support a streaming interface.

My sense is that this is going to be a significant change in architecture (creating a v3) and therefore we should do this as a prototype outside of this repo first.

@shreypandey (Collaborator):

@sameersegal

At the moment this should be handled in FSM design. If you break this into 2 different states:

state n:
  self.send_message("message1")
  # automatically move forward

state n+1:
  # long process
  self.send_message("message2")

The suggested approach won't work, because currently the output from the FSM reaches the flow main process only when:

  1. the FSM waits for user/callback input, or
  2. the FSM reaches the end state.

Dividing into separate states won't solve the issue of getting FSM output in real time, as the output is buffered before it goes to the flow. This behavior is due to the fact that subprocess.run only returns when the FSM runner process completes (which happens under one of the conditions above).

One way to solve this issue would be to use a FIFO pipe to communicate between the FSM runner process and the flow main process.
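
For illustration, a minimal sketch of the FIFO idea (this is not existing JB-Manager code; the FIFO path, the runner invocation, and the forwarding stub are assumptions):

import os
import subprocess
import tempfile

def forward_to_flow(line: str) -> None:
    # Placeholder for handing the message to the flow main process / channel.
    print(line, end="")

fifo_path = os.path.join(tempfile.mkdtemp(), "fsm_output.fifo")
os.mkfifo(fifo_path)  # named pipe, POSIX only

# Hypothetical invocation: the runner is told where to write each message.
runner = subprocess.Popen(["python", "fsm_runner.py", "--output-fifo", fifo_path])

# Opening the FIFO for reading blocks until the runner opens it for writing;
# after that, each line arrives as soon as the runner writes and flushes it.
with open(fifo_path) as fifo:
    for line in fifo:
        forward_to_flow(line)

runner.wait()
os.remove(fifo_path)

With this shape, message1 would reach the flow while the long-running step in the FSM is still executing, instead of only after the runner exits.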

@shreypandey reopened this on Jun 9, 2024
@sameersegal (Collaborator) commented Jun 10, 2024

I understand it better now. Thanks.

Let's look at this in a bit.

@Lekhanrao (Collaborator):

@sameersegal and @shreypandey, could you please provide an update on this?

@KaranrajM (Contributor):

This needs more information before it can be converted to the C4GT issue template.

@KaranrajM (Contributor):

The issue below can also be addressed by this fix:

Describe the bug:
While in a state, if the user sends multiple messages, the bot can end up producing junk and sending back a range of responses, sometimes even resetting the state.

Steps to reproduce:
In any state, send more than one message to the bot.

Expected behavior:
The bot should ideally terminate the earlier request, resend the request with the new context/messages, and process it once.
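
For illustration, a minimal sketch of that "cancel the in-flight request and reprocess with the full context" behaviour, assuming an async handler (the class and names below are hypothetical, not JB-Manager code):

import asyncio

class SessionHandler:
    def __init__(self):
        self._pending = []   # messages received for the current request
        self._task = None    # the in-flight processing task, if any

    async def on_user_message(self, text: str) -> None:
        self._pending.append(text)
        # Cancel the in-flight request, then reprocess with the full context.
        if self._task is not None and not self._task.done():
            self._task.cancel()
        self._task = asyncio.create_task(self._process(list(self._pending)))

    async def _process(self, messages: list[str]) -> None:
        combined = " ".join(messages)
        print(f"processing: {combined}")  # placeholder for invoking the FSM
        self._pending.clear()             # request completed; start fresh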

@shreypandey removed the wontfix (This will not be worked on) label on Aug 1, 2024
@Lekhanrao (Collaborator):

This is a big, complex architectural change. A design needs to be done first, so this will need further detailed discussions on the way forward.

@Lekhanrao (Collaborator):

Varun suggested another option: treating time as a delimiter, i.e. collecting all requests from the user until the user stops, and then processing everything together as a single message. @KaranrajM and @DevvStrange to brainstorm and get back with their thoughts by this Friday (18th Oct).
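
For illustration, a minimal sketch of that "time as a delimiter" idea, assuming an async channel handler (QUIET_PERIOD and the names below are illustrative, not existing JB-Manager code):

import asyncio

QUIET_PERIOD = 10  # assumed seconds of user silence before processing starts

async def process_batch(text: str) -> None:
    # Placeholder: hand the combined message to the FSM as a single input.
    print(text)

class MessageBuffer:
    def __init__(self):
        self._messages = []
        self._timer = None

    async def on_user_message(self, text: str) -> None:
        self._messages.append(text)
        # Restart the quiet-period timer on every new message.
        if self._timer is not None:
            self._timer.cancel()
        self._timer = asyncio.create_task(self._flush_after_quiet())

    async def _flush_after_quiet(self) -> None:
        await asyncio.sleep(QUIET_PERIOD)
        batch = " ".join(self._messages)
        self._messages.clear()
        await process_batch(batch)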

@Lekhanrao (Collaborator):

@DevvStrange, you were brainstorming on this topic, weren't you? Any update?

@DevvStrange (Contributor):

@Lekhanrao Yes, we analysed the solution Varun shared. The takeaway was that using time as a delimiter would not be a good user experience, because every request would wait for a fixed period, say 10 seconds, before processing starts. That 10-second wait might also not be ideal from the user's perspective when sending a follow-up message. So we should brainstorm further for a better solution.

@shreypandey (Collaborator):

Hi @DevvStrange,
This issue is due to the fact that the subprocess.run call in bot_input.py only returns after fsm_runner finishes execution. You can explore a pipe-based approach in the subprocess call, which streams the response in real time. That would handle this issue.
Something like this:

import subprocess

# command is the existing fsm_runner invocation used by bot_input.py
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)

# Stream the output line by line as the runner produces it
try:
    for line in iter(process.stdout.readline, ''):
        print(line, end='')  # output in real time
finally:
    process.stdout.close()
    process.stderr.close()
    process.wait()

@Lekhanrao (Collaborator):

@DevvStrange, I wanted to check whether you have been looking into Shrey's comments above. Based on that, what would be our next steps?

@DevvStrange (Contributor):

We will have an internal catch-up and, if required, will connect with Shrey to take this further.

@Lekhanrao (Collaborator):

@DevvStrange, do we have a plan now?
