With the release of iOS 14.3, Apple has made the camera and microphone available in WKWebView. This allows applications focused on web content, as well as third-party browsers on iOS, to leverage WebRTC (among other things).
To use the camera and microphone, applications must declare NSCameraUsageDescription and NSMicrophoneUsageDescription in the Info.plist and request permission using AVFoundation. Once that is done, you are free to use the camera and microphone in web content with the right configuration....
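As a rough illustration, here is a minimal Swift sketch of that setup. The permission request uses AVCaptureDevice.requestAccess(for:); the allowsInlineMediaPlayback flag is an assumption about what the "right configuration" involves, and the function names are hypothetical.

```swift
import AVFoundation
import WebKit

// Info.plist must contain NSCameraUsageDescription and
// NSMicrophoneUsageDescription, or the request below will crash the app.
func requestCaptureAccess(completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { videoGranted in
        AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
            DispatchQueue.main.async {
                completion(videoGranted && audioGranted)
            }
        }
    }
}

// Hypothetical web view factory; allowsInlineMediaPlayback is an assumed
// configuration choice so getUserMedia() video can render inline.
func makeWebView() -> WKWebView {
    let configuration = WKWebViewConfiguration()
    configuration.allowsInlineMediaPlayback = true
    return WKWebView(frame: .zero, configuration: configuration)
}
```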
When embedding web content in iOS applications, it is necessary to become the delegate of the web view displaying the content so that you can handle errors during the loading process. Depending on the UI you have implemented around said web view(s), you might also have navigation controls as well as a cancel/reload button. For UIWebView, the webView:didFailLoadWithError: method is used to tell the delegate that an error occurred during page load, whereas WKWebView uses both webView:didFailNavigation:withError: and webView:didFailProvisionalNavigation:withError: to inform the delegate....
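For reference, a minimal sketch of a WKNavigationDelegate handling both error callbacks; WebContainerViewController and showError(_:) are hypothetical names standing in for whatever UI you have built around the web view.

```swift
import UIKit
import WebKit

final class WebContainerViewController: UIViewController, WKNavigationDelegate {
    let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.navigationDelegate = self
        view.addSubview(webView)
    }

    // Called when an error occurs after the navigation has been committed.
    func webView(_ webView: WKWebView, didFail navigation: WKNavigation!, withError error: Error) {
        showError(error)
    }

    // Called when an error occurs before any content arrives (e.g. DNS failure).
    func webView(_ webView: WKWebView, didFailProvisionalNavigation navigation: WKNavigation!, withError error: Error) {
        showError(error)
    }

    private func showError(_ error: Error) {
        // Hypothetical: update navigation controls or show a cancel/reload button here.
        print("Page load failed: \(error.localizedDescription)")
    }
}
```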
Up until iOS 8 (2014), displaying web content in an application required you to either launch out to Safari or use UIWebView. The benefit of using UIWebView was that the user did not have to leave your application, but the downside was that UIWebView was significantly underpowered compared to Safari, which had (and still has) the benefit of a more modern rendering and JavaScript engine, so UIWebView content did not perform as well....