
.net-core – WebRTC and ASP.NET Core

I want to record an audio stream from my Angular web app to my ASP.NET Core API.

I think SignalR, over its WebSocket transport, would be a good way to do this.

With this TypeScript code, I can get the MediaStream:

import { HubConnection } from '@aspnet/signalr';

[...]

private stream: MediaStream;
private connection: RTCPeerConnection;
@ViewChild('video') video;

[...]

navigator.mediaDevices.getUserMedia({ audio: true })
  .then(stream => {
    console.trace('Received local stream');
    this.video.srcObject = stream;
    this.stream = stream;

    var _hubConnection = new HubConnection('[MY_API_URL]/webrtc');
    _hubConnection.send("SendStream", stream);
  })
  .catch(function (e) {
    console.error('getUserMedia() error: ' + e.message);
  });

I handle the stream in my .NET Core API:

public class MyHub : Hub
{
    // Invoked by the client via hubConnection.send("SendStream", ...).
    public void SendStream(object o)
    {
    }
}

But when I cast o to a System.IO.Stream, I get null.

When I read the WebRTC documentation, I came across RTCPeerConnection, IceConnection, and so on. Do I need them?

How can I stream audio from a web client to an ASP.NET Core API using SignalR? Is there documentation or a GitHub sample?

Thanks for your help.

Solution

I found a way to access the microphone stream and transmit it to the server. Here is the code:

private audioCtx: AudioContext;
private stream: MediaStream;

// Convert Web Audio Float32 samples (range [-1, 1]) to 16-bit signed PCM.
convertFloat32ToInt16(buffer: Float32Array) {
  let l = buffer.length;
  let buf = new Int16Array(l);
  while (l--) {
    // Clamp to [-1, 1] before scaling to the Int16 range.
    buf[l] = Math.max(-1, Math.min(1, buffer[l])) * 0x7FFF;
  }
  return buf.buffer;
}

startRecording() {
  navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
      this.audioCtx = new AudioContext();
      this.audioCtx.onstatechange = (state) => { console.log(state); }

      var scriptNode = this.audioCtx.createScriptProcessor(4096, 1, 1);
      scriptNode.onaudioprocess = (audioProcessingEvent) => {
        // The input buffer holds the live microphone samples.
        var inputBuffer = audioProcessingEvent.inputBuffer;
        // Loop through the input channels (in this case there is only one).
        for (var channel = 0; channel < inputBuffer.numberOfChannels; channel++) {
          var chunk = inputBuffer.getChannelData(channel);
          // Convert before sending, because sample format and endianness matter on the server.
          this.MySignalRService.send("SendStream", this.convertFloat32ToInt16(chunk));
        }
      }
      var source = this.audioCtx.createMediaStreamSource(stream);
      source.connect(scriptNode);
      scriptNode.connect(this.audioCtx.destination);

      this.stream = stream;
    })
    .catch(function (e) {
      console.error('getUserMedia() error: ' + e.message);
    });
}

stopRecording() {
  try {
    let stream = this.stream;
    stream.getAudioTracks().forEach(track => track.stop());
    stream.getVideoTracks().forEach(track => track.stop());
    this.audioCtx.close();
  }
  catch (error) {
    console.error('stopRecording() error: ' + error);
  }
}

The next step is to convert the Int16Array data to a WAV file.
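That conversion is mostly mechanical: a WAV file is just a 44-byte RIFF header followed by the raw PCM samples. As a rough sketch that is not part of the original answer (encodeWav is a hypothetical helper; the chunks are assumed to be the ArrayBuffers produced by convertFloat32ToInt16 above), it could look like this:

// Minimal sketch: wrap 16-bit mono PCM chunks in a RIFF/WAVE header.
function encodeWav(chunks: ArrayBuffer[], sampleRate: number): Blob {
  const dataLength = chunks.reduce((sum, c) => sum + c.byteLength, 0);
  const header = new ArrayBuffer(44);
  const view = new DataView(header);

  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) {
      view.setUint8(offset + i, s.charCodeAt(i));
    }
  };

  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataLength, true);  // total RIFF size minus 8 bytes
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);              // fmt chunk size
  view.setUint16(20, 1, true);               // audio format: PCM
  view.setUint16(22, 1, true);               // channels: mono
  view.setUint32(24, sampleRate, true);      // e.g. this.audioCtx.sampleRate
  view.setUint32(28, sampleRate * 2, true);  // byte rate = sampleRate * blockAlign
  view.setUint16(32, 2, true);               // block align: 1 channel * 2 bytes
  view.setUint16(34, 16, true);              // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataLength, true);      // PCM payload size

  return new Blob([header, ...chunks], { type: 'audio/wav' });
}

A call such as encodeWav(recordedChunks, this.audioCtx.sampleRate) would yield a playable Blob; the same 44-byte header could just as well be written on the server in front of the bytes received by the hub.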

Sources that helped me:

> https://subvisual.co/blog/posts/39-tutorial-html-audio-capture-streaming-to-node-js-no-browser-extensions/
> https://medium.com/@yushulx/learning-how-to-capture-and-record-audio-in-html5-6fe68a769bf9

Note: I have not included the code for configuring SignalR, since that is not the point here.
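For readers who want a starting point anyway, here is a minimal, hypothetical sketch of what a service like the MySignalRService used above might look like with the @aspnet/signalr package (the /webrtc route comes from the question; the class shape and names are assumptions):

import { HubConnection, HubConnectionBuilder } from '@aspnet/signalr';

// Hypothetical wrapper around the hub connection used in the recording code.
export class MySignalRService {
  private connection: HubConnection;

  constructor(apiUrl: string) {
    this.connection = new HubConnectionBuilder()
      .withUrl(apiUrl + '/webrtc')  // hub route taken from the question
      .build();
  }

  // The connection must be started (and awaited) before the first send.
  start(): Promise<void> {
    return this.connection.start();
  }

  send(method: string, payload: any): void {
    this.connection.send(method, payload)
      .catch(err => console.error('SignalR send failed: ' + err));
  }
}

Note that starting the connection before calling send is one step the code in the original question skipped.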
