.net-core – WebRTC and ASP.NET Core
I want to record an audio stream from an Angular web app to my ASP.NET Core API.

I thought SignalR and its websockets would be a good way to do this. With this TypeScript code I can get a MediaStream:

```typescript
import { HubConnection } from '@aspnet/signalr';

[...]

private stream: MediaStream;
private connection: webkitRTCPeerConnection;
@ViewChild('video') video;

[...]

navigator.mediaDevices.getUserMedia({ audio: true })
  .then(stream => {
    console.trace('Received local stream');
    this.video.srcObject = stream;
    this.stream = stream;

    const hubConnection = new HubConnection('[MY_API_URL]/webrtc');
    hubConnection.send("SendStream", stream);
  })
  .catch(function (e) {
    console.error('getUserMedia() error: ' + e.message);
  });
```

On the .NET Core side, I handle the stream with:

```csharp
public class MyHub : Hub
{
    public void SendStream(object o)
    {
    }
}
```

But when I cast `o` to `System.IO.Stream`, I get `null`.

Reading the WebRTC documentation, I came across RTCPeerConnection and IceConnection. Do I need those?

How can I stream audio from a web client to an ASP.NET Core API with SignalR? Any documentation or GitHub examples?

Thanks for your help.

Solution
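For context on why the hub receives nothing usable: SignalR's default hub protocol JSON-serializes each argument, and a `MediaStream` exposes its data through methods and accessors rather than enumerable own properties, so it serializes to an empty object. A minimal sketch of this effect, using a hypothetical stand-in class (not a real `MediaStream`):

```typescript
// A real MediaStream keeps its data behind accessors and methods,
// not plain enumerable fields. This stand-in mimics that shape.
class FakeMediaStream {
  get id(): string { return "stream-1"; }           // accessor on the prototype
  getAudioTracks(): string[] { return ["audio-track"]; }
}

// Class getters live on the prototype, so JSON.stringify sees no own
// enumerable properties and produces an empty object — which is what
// the hub ends up receiving.
const serialized = JSON.stringify(new FakeMediaStream());
console.log(serialized); // "{}"
```

This is why the working answer below sends raw PCM sample buffers instead of the stream object itself.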
I found a way to access the microphone stream and transmit it to the server; here is the code:
```typescript
private audioCtx: AudioContext;
private stream: MediaStream;

// Convert Web Audio's Float32 samples (range [-1, 1]) to 16-bit PCM.
convertFloat32ToInt16(buffer: Float32Array) {
  let l = buffer.length;
  const buf = new Int16Array(l);
  while (l--) {
    // Clamp to [-1, 1] before scaling to avoid integer overflow.
    buf[l] = Math.max(-1, Math.min(1, buffer[l])) * 0x7FFF;
  }
  return buf.buffer;
}

startRecording() {
  navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
      this.audioCtx = new AudioContext();
      this.audioCtx.onstatechange = (state) => {
        console.log(state);
      };

      const scriptNode = this.audioCtx.createScriptProcessor(4096, 1, 1);
      scriptNode.onaudioprocess = (audioProcessingEvent) => {
        const inputBuffer = audioProcessingEvent.inputBuffer;
        // Loop through the input channels (in this case there is only one).
        for (let channel = 0; channel < inputBuffer.numberOfChannels; channel++) {
          const chunk = inputBuffer.getChannelData(channel);
          // Endianness matters, so convert before sending.
          this.MySignalRService.send("SendStream", this.convertFloat32ToInt16(chunk));
        }
      };

      const source = this.audioCtx.createMediaStreamSource(stream);
      source.connect(scriptNode);
      scriptNode.connect(this.audioCtx.destination);

      this.stream = stream;
    })
    .catch(function (e) {
      console.error('getUserMedia() error: ' + e.message);
    });
}

stopRecording() {
  try {
    const stream = this.stream;
    stream.getAudioTracks().forEach(track => track.stop());
    stream.getVideoTracks().forEach(track => track.stop());
    this.audioCtx.close();
  } catch (error) {
    console.error('stopRecording() error: ' + error);
  }
}
```

The next step is to convert the Int16Array chunks into a WAV file.

Source that helped me:

> https://subvisual.co/blog/posts/39-tutorial-html-audio-capture-streaming-to-node-js-no-browser-extensions/

Note: I have not included the SignalR configuration code, as that is not the point here.
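For that "next step", here is a hedged sketch of wrapping raw 16-bit mono PCM in a minimal 44-byte WAV (RIFF) header. The function name `encodeWav` and the 16 kHz default are my own choices, not from the original post; the sample rate should match what the `AudioContext` actually captured.

```typescript
// Wrap raw 16-bit mono PCM samples in a minimal WAV (RIFF) container.
// encodeWav is a hypothetical helper name, not part of any library.
function encodeWav(pcm: Int16Array, sampleRate: number = 16000): ArrayBuffer {
  const headerSize = 44;
  const dataSize = pcm.length * 2;               // 2 bytes per 16-bit sample
  const buffer = new ArrayBuffer(headerSize + dataSize);
  const view = new DataView(buffer);

  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };

  writeString(0, "RIFF");
  view.setUint32(4, 36 + dataSize, true);        // total size minus 8 bytes
  writeString(8, "WAVE");
  writeString(12, "fmt ");
  view.setUint32(16, 16, true);                  // fmt chunk size
  view.setUint16(20, 1, true);                   // audio format: PCM
  view.setUint16(22, 1, true);                   // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * 2, true);      // byte rate
  view.setUint16(32, 2, true);                   // block align
  view.setUint16(34, 16, true);                  // bits per sample
  writeString(36, "data");
  view.setUint32(40, dataSize, true);

  // WAV samples are little-endian, matching the Int16Array conversion above.
  for (let i = 0; i < pcm.length; i++) {
    view.setInt16(headerSize + i * 2, pcm[i], true);
  }
  return buffer;
}
```

On the server, the accumulated chunks could be concatenated and wrapped the same way in C#; the layout of the header is identical.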