Audio-to-Text Transcription Method with ChatGPT, LiveKit, and Flutter
I’m trying to implement audio transcription generated by ChatGPT and display it in a LiveKit video call (with camera) in Flutter, allowing the user to chat with a bot. The problem is that, at the moment, I can only list the local transcription (the audio sent by the user); I’m unable to access the remote transcription, i.e., the one generated by the bot.
I’ve identified a method called _onTranscriptionEvent(EngineTranscriptionReceivedEvent event) that handles both the user’s and the bot’s transcriptions. However, since this method is private and lives inside the SDK rather than in my own code, I don’t know how to subscribe to this stream from another class.
My goal is to list both local and remote transcriptions in real-time.
Here are the main points of the code I’m trying to use:
In the LiveKit repository, the file participant.dart (line 95) handles the local audio transcription.
In the file room.dart (line 817), the remote audio transcription generated by ChatGPT is processed.
My issue is how to access this method (_onTranscriptionEvent) or the corresponding stream from another class, given that it is private. I imagine there must be a public method or another way to access this functionality.
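If I understand the source correctly, _onTranscriptionEvent appears to re-emit the data it processes as a public event on the room’s event emitter, so I would expect a subscription along these lines to work. This is only a sketch of that idea: TranscriptionListener is my own wrapper class, and the TranscriptionEvent type with its participant, segments, and text fields are assumptions from my reading of the SDK source, not something I’ve confirmed in the docs:

```dart
import 'package:livekit_client/livekit_client.dart';

/// My own wrapper class (hypothetical) that subscribes to transcription
/// events for every participant, local and remote.
class TranscriptionListener {
  final Room room;
  late final EventsListener<RoomEvent> _listener;

  TranscriptionListener(this.room) {
    // Public events API: a Room exposes createListener() for RoomEvents.
    _listener = room.createListener();

    // Assumption: the SDK re-emits what _onTranscriptionEvent processes as
    // a public TranscriptionEvent carrying the participant and the segments.
    _listener.on<TranscriptionEvent>((event) {
      // Distinguish the user's (local) transcription from the bot's (remote).
      final source =
          event.participant is LocalParticipant ? 'local' : 'remote';
      for (final segment in event.segments) {
        // segment.text assumed per the TranscriptionSegment model.
        print('[$source] ${segment.text}');
      }
    });
  }

  // Listeners should be disposed to avoid leaks.
  void dispose() {
    _listener.dispose();
  }
}
```

If TranscriptionEvent is not actually public in the version I’m using (or is only emitted for the local participant), that would explain what I’m seeing, in which case I’d need the supported way to observe the remote side.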
Does anyone know how I can solve this?