
iOS max encoders limitation issue #17

Closed
yahmad opened this issue Jun 20, 2018 · 8 comments

@yahmad

yahmad commented Jun 20, 2018

This issue is mentioned in the Twilio changelog under known issues:

> Typically, a maximum of three H.264 encoders can be used at once. When this limit is exceeded no errors are raised and new video Tracks are not encoded.

I just wanted to have this tracked as an issue, and to find out what the status is (I can't find a related open issue). Sorry for being a nuisance. 😉

The issue is quite severe, as it limits the ability to create cross-platform multi-party P2P rooms: only an Android device can provide video tracks to more than three other participants. Can you give any progress updates on this issue, or an ETA on a fix?

Video iOS SDK

TwilioVideo v2.1 via CocoaPods

@ceaglest self-assigned this Jun 20, 2018
@ceaglest
Contributor

ceaglest commented Jun 20, 2018

Hi @yahmad,

> I just wanted to have this tracked as an issue, and to find out what the status is (I can't find a related open issue). Sorry for being a nuisance. 😉

Not at all; there was no pre-existing GitHub issue.

> The issue is quite severe, as it limits the ability to create cross-platform multi-party P2P rooms: only an Android device can provide video tracks to more than three other participants.

The limitation of three encoders exists in VideoToolbox.framework and is hardware specific. From what we've been told by Apple engineers, the iPhone X is not subject to this fixed limit and multiplexes the hardware resources if you ask for many simultaneous encoders. We have a test case which we will update to validate this on the iPhone X. Right now we just make sure that each device we test on can encode up to 3 video streams in H.264.
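
For reference, a rough Swift probe of this behaviour might look like the following. This is not Twilio SDK code: the function name and the 1280x720 / four-session figures are arbitrary, and it assumes a recent SDK where VideoToolbox exposes labeled Swift signatures.

import CoreMedia
import VideoToolbox

// Rough probe: count how many 720p H.264 compression sessions VideoToolbox
// will hand out before creation or preparation fails. As the changelog entry
// notes, exceeding the limit may raise no error at all on some hardware, so a
// clean result here is not a guarantee that encoding will actually work.
func probeH264EncoderCount(upTo limit: Int = 4) -> Int {
    var sessions: [VTCompressionSession] = []
    defer { sessions.forEach { VTCompressionSessionInvalidate($0) } }

    for _ in 0..<limit {
        var session: VTCompressionSession?
        let status = VTCompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            width: 1280,
            height: 720,
            codecType: kCMVideoCodecType_H264,
            encoderSpecification: nil,      // iOS selects the hardware encoder by default
            imageBufferAttributes: nil,
            compressedDataAllocator: nil,
            outputCallback: nil,            // frames would be delivered via an output handler
            refcon: nil,
            compressionSessionOut: &session)
        guard status == noErr,
              let encoder = session,
              VTCompressionSessionPrepareToEncodeFrames(encoder) == noErr else {
            break
        }
        sessions.append(encoder)
    }
    return sessions.count
}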

I believe there is a way to fix this for all devices, but to do it we need some internal WebRTC changes that are coming as part of 2.3.0:

  1. If an H.264 encoder is preferred and can be created, then use it.
  2. If not, fail and allow the system to fall back to the next preferred codec (often VP8).

> Can you give any progress updates on this issue, or an ETA on a fix?

We are going to have a short preview and beta period for 2.3.0 because it involves an upgrade of the version of WebRTC that we depend on. Expect 2.3.0-preview1 in the next few weeks. Once that release has landed we can start taking a look at this issue again in the lead up to 2.3.0.

In the meantime, the only workarounds I can suggest are to either:

  1. Use a Group Room for larger-scale multi-party. You will only need to encode one H.264 stream per VideoTrack shared.
  2. Don't prefer H.264 on your iOS clients (or special-case the iPhone X); see the sketch below.
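
For example, a minimal sketch of option 2, assuming the 2.x codec preference classes (TVIVp8Codec / TVIH264Codec) and the TVIConnectOptions builder from the Quickstart. makeConnectOptions is just an illustrative helper, and preferH264 is a placeholder for your own device check (e.g. treating the iPhone X as capable of many simultaneous encoders).

import TwilioVideo

func makeConnectOptions(token: String, roomName: String, preferH264: Bool) -> TVIConnectOptions {
    return TVIConnectOptions.init(token: token) { (builder) in
        builder.roomName = roomName
        if preferH264 {
            // Hardware that multiplexes encoder resources can keep preferring H.264.
            builder.preferredVideoCodecs = [TVIH264Codec()]
        } else {
            // Everyone else prefers VP8, which avoids the three-encoder H.264 limit.
            builder.preferredVideoCodecs = [TVIVp8Codec()]
        }
    }
}

With a Group Room (option 1) no client-side change like this is needed, since each VideoTrack is encoded once regardless of how many participants receive it.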

Regards,
Chris

@yahmad
Author

yahmad commented Jun 20, 2018

@ceaglest Thanks for the quick response - and the useful feedback!

I will make a note to test on the iPhone X, and keep an eye out for the preview to give it a test. 👍

@ceaglest
Contributor

ceaglest commented Aug 8, 2018

Hi @yahmad,

We have now released 2.3.0, which is based upon WebRTC 67. While I didn't get a chance to work on this bug directly, the updated WebRTC internals give us a lot more control over how video codecs are managed. I still think this problem is solvable, so I'll be doing some more investigation on it next week.

Thanks for your patience,
Chris

@yahmad
Author

yahmad commented Aug 9, 2018

Thanks for the update, Chris. A fix for this would be very useful.

Yasir

@Girish-iweb

Girish-iweb commented Apr 1, 2019

Hello,
I am following the video call Quickstart step by step, using the latest TwilioVideo 2.8 CocoaPod, and I am having trouble with audio.
I cannot hear the video call audio; I can only see the remote participants, and the remote preview view is automatically destroyed.
I am working in Swift 4.0.

func connectionTwilio() {
    self.prepareLocalMedia()

    let connectOptions = TVIConnectOptions.init(token: accessToken) { (builder) in
        if let videoTrack = self.localVideoTrack {
            builder.videoTracks = [videoTrack]
        }
        if let audioTrack = self.localAudioTrack {
            builder.audioTracks = [audioTrack]
        }
        // Use the preferred audio codec
        if let preferredAudioCodec = Settings.shared.audioCodec {
            builder.preferredAudioCodecs = [preferredAudioCodec]
        }
        // Use the preferred video codec
        if let preferredVideoCodec = Settings.shared.videoCodec {
            builder.preferredVideoCodecs = [preferredVideoCodec]
        }
        // Use the preferred encoding parameters
        if let encodingParameters = Settings.shared.getEncodingParameters() {
            builder.encodingParameters = encodingParameters
        }
        builder.roomName = self.roomName
    }

    // APIManager.sharedInstance.centerToast(messsage: "Connecting...", view: self.view)
    room = TwilioVideo.connect(with: connectOptions, delegate: self)
    self.showRoomUI(inRoom: true)
}

func prepareLocalMedia() {
    if (localAudioTrack == nil) {
        localAudioTrack = TVILocalAudioTrack.init(options: nil, enabled: true, name: "mic")
    }
    if (localVideoTrack == nil) {
        self.setupPreview()
    }
}

@Girish-iweb

func setupPreview() {
    let frontCamera = TVICameraSource.captureDevice(for: .front)
    let backCamera = TVICameraSource.captureDevice(for: .back)
    if (frontCamera != nil || backCamera != nil) {
        // Preview our local camera track in the local video preview view.
        let cameraSourceOptions = TVICameraSourceOptions.init { (builder) in
            builder.enablePreview = true
        }
        camera = TVICameraSource(options: cameraSourceOptions, delegate: self)
        // camera = TVICameraSource(delegate: self)
        localVideoTrack = TVILocalVideoTrack.init(source: camera!, enabled: true, name: "Camera")
        // Add renderer to video track for local preview
        localVideoTrack!.addRenderer(self.previewView)
        camera!.startCapture(with: frontCamera != nil ? frontCamera! : backCamera!) { (captureDevice, videoFormat, error) in
            if let error = error {
                APIManager.sharedInstance.centerToast(messsage: "Error \(error.localizedDescription)", view: self.view)
            } else {
                self.previewView.shouldMirror = (captureDevice.position == .front)
            }
        }
    } else {
        APIManager.sharedInstance.centerToast(messsage: "No front or back capture device found", view: self.view)
    }
}

@piyushtank
Contributor

@Girish-iweb Do you think the issue you are reporting is related to the iOS max encoders limitation? If not, please open a separate issue.
Also, if you are having media issues in your video call, please refer to our Quickstart example for how to set up the renderer and camera.

@Girish-iweb

Girish-iweb commented Apr 2, 2019

@piyushtank
I am using the Quickstart example step by step, but I am having audio problems with room participants over a peer-to-peer video call connection.
When I create a room I can hear the participants' audio and see their video, but the other participants cannot hear my audio.

Below is my code:

self.prepareLocalMedia()

let connectOptions = TVIConnectOptions.init(token: accessToken) { (builder) in
    if let videoTrack = self.localVideoTrack {
        builder.videoTracks = [videoTrack]
    }
    if let audioTrack = self.localAudioTrack {
        builder.audioTracks = [audioTrack]
    }

    if let preferredAudioCodec = Settings.shared.audioCodec {
        builder.preferredAudioCodecs = [preferredAudioCodec]
    }

    // Use the preferred video codec
    if let preferredVideoCodec = Settings.shared.videoCodec {
        builder.preferredVideoCodecs = [preferredVideoCodec]
    }

    // Use the preferred encoding parameters
    if let encodingParameters = Settings.shared.getEncodingParameters() {
        builder.encodingParameters = encodingParameters
    }

    builder.roomName = self.roomName
}

// APIManager.sharedInstance.centerToast(messsage: "Connecting...", view: self.view)
room = TwilioVideo.connect(with: connectOptions, delegate: self)

@kierandonaldson closed this as not planned May 27, 2022