
I’m trying to display the audio transcription generated by ChatGPT in a LiveKit video call (with camera), so the user can chat with the bot. The problem is that, at the moment, I can only list the local transcription (the audio sent by the user); I can’t access the remote transcription, i.e., the one generated by the bot.

I’ve identified a method called _onTranscriptionEvent(EngineTranscriptionReceivedEvent event) that receives both the user’s and the bot’s transcriptions. However, since this method is private and outside the scope of my code, I don’t know how to access this stream from another class.

My goal is to list both local and remote transcriptions in real-time.
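To make the goal concrete, here is roughly what I have in mind, written against the public events API. This is only a sketch: I’m assuming the SDK re-emits the engine event as a public TranscriptionEvent with participant and segments fields (names taken from my reading of the SDK source, so treat them as unverified):

```dart
import 'package:livekit_client/livekit_client.dart';

// Sketch: subscribe to transcriptions through the room's public
// EventsListener instead of the private _onTranscriptionEvent.
// Assumption: this SDK version re-emits the engine event as a public
// TranscriptionEvent; segment.text / segment.isFinal are my guesses
// at the TranscriptionSegment field names.
void listenForTranscriptions(Room room) {
  final listener = room.createListener();

  listener.on<TranscriptionEvent>((event) {
    // Local participant = the user; anything else = the bot.
    final who = event.participant is LocalParticipant
        ? 'user (local)'
        : 'bot (remote)';

    for (final segment in event.segments) {
      print('[$who] ${segment.text} (final: ${segment.isFinal})');
    }
  });
}
```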

Here are the main points of the code I’m trying to use:

In the LiveKit repository, participant.dart (line 95) handles the local audio transcription.

In room.dart (line 817), the remote audio transcription generated by ChatGPT is processed.

My issue is how to access this method (_onTranscriptionEvent), or the corresponding event stream, from another class, given that it is private. I imagine there must be a public method or some other way to reach this functionality.
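For context, what I’m imagining is a small wrapper class that owns the room listener and re-exposes the segments on a broadcast Stream, so that any other class can subscribe without touching anything private. Again, TranscriptionEvent and its fields are my assumption about the public API; everything else is plain Dart:

```dart
import 'dart:async';

import 'package:livekit_client/livekit_client.dart';

// Hypothetical wrapper: turns the room's transcription events into a
// broadcast stream that any other class can listen to.
class TranscriptionService {
  TranscriptionService(Room room) {
    _listener = room.createListener()
      ..on<TranscriptionEvent>((event) {
        // Assumption: event.participant / event.segments exist on the
        // public event, mirroring the private engine event.
        for (final segment in event.segments) {
          _controller.add((
            isLocal: event.participant is LocalParticipant,
            text: segment.text,
          ));
        }
      });
  }

  late final EventsListener<RoomEvent> _listener;

  final _controller =
      StreamController<({bool isLocal, String text})>.broadcast();

  // Both local (user) and remote (bot) transcriptions, in real time.
  Stream<({bool isLocal, String text})> get transcriptions =>
      _controller.stream;

  Future<void> dispose() async {
    await _listener.dispose();
    await _controller.close();
  }
}
```

Other classes would then just call service.transcriptions.listen(...) and filter on isLocal to tell the user’s segments apart from the bot’s.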

Does anyone know how I can solve this?
