Safari User Guide
You can change options in Safari preferences so that Safari always accepts or always blocks cookies and website data.
In the Safari app on your Mac, choose Safari > Preferences, click Privacy, then do any of the following:
Prevent trackers from using cookies and website data to track you: Select 'Prevent cross-site tracking.'
Cookies and website data are deleted unless you visit and interact with the trackers' websites.
Always block cookies: Select 'Block all cookies.'
Websites, third parties, and advertisers can't store cookies and other data on your Mac. This may prevent some websites from working properly.
Always allow cookies: Deselect 'Block all cookies.'
Websites, third parties, and advertisers can store cookies and other data on your Mac.
Remove stored cookies and data: Click Manage Website Data, select one or more websites, then click Remove or Remove All.
Removing the data may reduce tracking, but may also log you out of websites or change website behavior.
See which websites store cookies or data: Click Manage Website Data.
Note: Changing your cookie preferences or removing cookies and website data in Safari may change or remove them in other apps.
A week ago, new iPhones were released along with iOS 11 – a notable event. Among everything else, this release brought one more important thing to developers: the Safari browser received long-awaited support for WebRTC.
Think about it for a minute: millions of iPhones and iPads all over the world suddenly learned to play real-time audio and video in a browser. iOS and Mac users can now enjoy fully functional in-browser video chats, live broadcasts with low (sub-second) real-time latency, calls, conferences, and more. The road was long, and now we are here.
Before
Previously, we wrote about a way to play video with minimum latency in iOS Safari, and that method is still relevant for iOS 9 and iOS 10, which lack support for WebRTC. We suggested an approach, code-named 'WSPlayer', that delivers a live video stream via the Websocket protocol, decodes the stream in JavaScript, and renders the video onto the Canvas HTML5 element using WebGL. The received audio stream had to be played using the browser's Web Audio API. Here is how it looked:
This approach allowed, and still allows, playing a stream on a page in the iOS Safari browser with an overall latency of about 3 seconds, but it has its disadvantages:
1. Performance.
The video stream is decoded in JavaScript. This puts a high CPU load on the mobile device, prevents playing higher resolutions, and drains the battery.
2. TCP.
Video and audio are transmitted over Websocket / TCP. Because of this, latency cannot be held to a specific target and can still increase if any network fluctuations occur.
Until iOS 11 was released, WSPlayer could play video with relatively low latency (3 seconds) compared to HLS (20 seconds). Now things have improved, and the JavaScript player gives way to native WebRTC, which does all the work by means of the browser itself, without JavaScript decoding or Canvas rendering.
Now
With the arrival of WebRTC, playing low latency video in iOS Safari 11 became identical to other browsers that support WebRTC, namely Chrome, Firefox, and Edge.
Microphone and camera
Above, we talked only about playing real-time video. But you cannot run a video chat without a camera and a microphone, and this was a real headache for developers planning to add iOS Safari support to their video chats or other live video projects. Thousands of man-hours were wasted searching for a solution in iOS Safari 9 and 10 that simply did not exist – Safari couldn't capture the camera or the microphone, and this 'feature' was fixed no sooner than in iOS 11.
Run iOS 11 Safari and request access to the camera and the microphone. Now, this is what we've been waiting for. The wait is over:
The browser asks for the camera and the microphone and now can both stream live video and play audio and video.
Also, you can take a look at Safari settings and turn on/off the microphone there:
Displaying the camera and playing a video stream
Of course, there are specifics. The most notable one is that the video element must be tapped (clicked) before the video starts to play.
For developers, this is a limitation and a showstopper. Indeed, if a customer insists 'I want this video to play automatically on load', this trick will not work in iOS Safari, so the developer will have to explain that this is down to Safari and Apple's strict security policy.
For users, however, this may be a good thing, because websites cannot start playing a video stream without the explicit consent of the user, who confirms it by clicking the element.
What about Mac OS?
Here is some good news for Macbook and Mac OS owners. After the update, Safari 11 on Mac also supports WebRTC. Previously, Mac Safari used the old reliable Flash Player, which worked as a cheap replacement for WebRTC: it compressed and played audio and video via RTMP and RTMFP. But now that WebRTC is available, there is no need to use Flash Player for video chats anymore. So, we use WebRTC for Safari 11+ and continue using Flash Player or WebRTC plugins as a fallback mechanism in Safari 10.
Summary
As you can see, Safari 11 got support for WebRTC, while Safari 9 and 10 remained with fallbacks such as Flash Player and WebRTC plugins on Mac OS, and WSPlayer on iOS.
| Mac, Safari 10 | iOS 9, 10, Safari | Mac, Safari 11 | iOS 11, Safari |
|---|---|---|---|
| Flash Player, WebRTC plugins, WSPlayer | WSPlayer | WebRTC | WebRTC |
Testing browser-to-browser broadcasting
Now, let's run tests for the main use cases. We start with the player. First of all, we need to install the latest iOS 11.0.2 with the new Safari.
So, as the first test, we want Chrome for Windows to broadcast a video stream to the server, and a spectator on iOS Safari should play this video stream via WebRTC.
Open the Two Way Streaming example in the Chrome browser and send a WebRTC video stream called 1ad5 to the server. Chrome captures the video from the camera, encodes it (H.264 in our case) and sends the live video stream to the server for further sharing. Video stream broadcasting looks as follows:
To play, specify the name of the video stream, and the player in iOS Safari starts playing the stream sent by Chrome to the server. Playing the stream on an iPhone in the Safari browser looks like this:
Latency is hardly noticeable (less than a second). The video stream plays smoothly and without artifacts. Playback quality is good, as you can see on the screenshots.
And here is how video playback of the same Two Way Streaming example looks in the Play block. So you can broadcast one stream and play another on the same page in the browser. If users know each other's stream names, that is a simple video chat.
Testing web camera and microphone broadcasting using iOS Safari
As we mentioned above, the key feature of WebRTC is its ability to capture the camera and the microphone in the browser and send the stream to the network with targeted low latency. Let's see if this works in iOS Safari 11.
Open in Safari the same demo streamer example we opened in Chrome, and request access to the microphone and camera. Safari shows a dialog where you should either allow or disallow using the camera and the microphone.
After we've been granted access to the camera and the microphone, we should see the red camera icon in the top left corner of the browser. Safari indicates that the camera is active and in use, and the video stream is being sent to the server.
We fetch this stream in another browser, for example Chrome. On playback, we see the stream sent from Safari, shot in the infamous vertical orientation.
After the iPhone is turned horizontally, the streamed picture becomes normal:
Capturing and broadcasting a video are always more interesting than mere playback, because the most important things happen here, including the RTCP feedback that sets the target for latency and quality of the video.
At the time this article was written, we couldn't find any suitable tools to monitor WebRTC in iOS Safari, similar to the webrtc-internals tool in Chrome. So let's see how the server sees the video stream captured from Safari. To do this, we enable monitoring and look at the main graphs describing the traffic coming from Safari.
The first set of graphs displays metrics such as NACK and PLI, which indicate loss of UDP packets. For a network of normal quality, the NACK count shown on the graphs is considered low, near 15, so we can conclude the patient is doing well.
The FPS of the video stream stays near 29-31 and never drops to lower values (10-15). This means the iPhone's hardware acceleration is sufficient to encode the video to the H.264 codec, and the processor is powerful enough to stream this video to the network. For this test we used an iPhone 6, 16 GB.
The following graphs display how the resolution and bitrate of the video change over time. The video bitrate varies from 1.2 to 1.6 Mbps, and the resolution stays the same: 640×480. This means the bandwidth is sufficient, and Safari compresses the video at the maximum bitrate. Optionally, you can constrain the bitrate to certain limits.
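How you cap the bitrate depends on the SDK, but as a general illustration (independent of any particular product), one common WebRTC technique is to munge the SDP before it is sent: a `b=AS:<kbps>` bandwidth line after the video m-line asks the sender not to exceed that bitrate. A minimal sketch:

```javascript
// Sketch: cap the video bitrate by adding a "b=AS:<kbps>" bandwidth line
// right after the video m-line in an SDP offer/answer. This is a generic
// WebRTC technique, not the specific mechanism of the SDK in this article.
function capVideoBitrate(sdp, kbps) {
  // If a bandwidth line already exists, just update its value
  if (/b=AS:\d+/.test(sdp)) {
    return sdp.replace(/b=AS:\d+/, 'b=AS:' + kbps);
  }
  // Otherwise insert the bandwidth line after "m=video ..."
  return sdp.replace(/(m=video [^\r\n]*\r?\n)/, '$1b=AS:' + kbps + '\r\n');
}
```

Such a function would be applied to the local SDP in the signaling path before `setLocalDescription`, or on the server side before forwarding the answer.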
Then we check the bitrate of the audio part of the stream and the audio loss statistics. We can see that there are no lost audio packets; the counter is strictly zero. The audio bitrate is 30-34 kbps. This is the Opus codec that Safari uses to compress the audio stream captured from the microphone.
Finally, the last graphs are timecodes. Timecodes allow evaluating whether video and audio are synchronized. A lack of synchronization leads to noticeable discrepancy: the voice lags behind the lips, or runs ahead of the video. In our case the stream from Safari is perfectly synchronized and proceeds monotonically without any deviations.
From these graphs we see behavior typical for WebRTC and very similar to that of Google Chrome: NACK and PLI feedback arrives, FPS changes only slightly, and the bitrate varies. In other words, we've got the WebRTC we've been waiting for.
Note the changes of width and height. For example, if we change orientation to horizontal, the resolution of the stream flips from 640×480 to 480×640, as shown below.
The orange line here is the width, and the cyan line is the height of the image. At 05:21:17 we turned the streaming iPhone horizontally, and the resolution of the stream changed accordingly: width 480 and height 640.
Testing video playback from an IP-camera using WebRTC for iOS Safari
An IP camera is essentially a small Linux server that sends streams via the RTSP protocol. In this test, we fetch the video from an IP camera that supports H.264 and play it in the iOS Safari browser via WebRTC. To do this, we enter the RTSP address of the stream instead of its name in the player we used before.
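In code, this is the same play call as before, with the RTSP URL passed where the stream name would normally go. The helper and session object below are an illustrative sketch, not the actual player source:

```javascript
// Sketch: play an IP camera in the browser by passing its RTSP address
// as the stream name; the server pulls RTSP and re-publishes over WebRTC.
// "session" stands for an established SDK session (an assumption here).
function playIpCameraStream(session, display, rtspUrl) {
  var stream = session.createStream({
    name: rtspUrl,     // e.g. "rtsp://camera.local/live" (hypothetical URL)
    display: display   // div element that will receive the remote video
  });
  stream.play();       // negotiate WebRTC and start playback
  return stream;
}
```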
Playing a stream from the IP-camera in Safari via WebRTC looks as follows:
In this case the video plays smoothly without any problems or glitches. However, the source of the stream has a significant effect on playback: depending on how the video travels from the IP camera to the server, things can look different.
As a result, we successfully tested three cases:
- Broadcasting from the Chrome browser to Safari
- Capturing of the camera and the microphone and broadcasting from Safari to Chrome
- Playing video from an IP-camera in iOS Safari
A few words about the code
To broadcast video streams, we use the universal API (Web SDK), which for broadcasting purposes looks like this:
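A minimal sketch of such a broadcasting call (the session object and the createStream/publish names follow the description below, but are assumptions, not a verbatim copy of the Web SDK):

```javascript
// Sketch: publish the camera/microphone as a WebRTC stream via an
// established SDK session ("session" and its methods are assumptions).
function publishStream(session, options) {
  var stream = session.createStream({
    name: options.name,       // unique stream name, e.g. "stream22"
    display: options.display  // div element that shows the captured camera
  });
  stream.publish();           // capture camera/mic and send to the server
  return stream;
}
```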
Here, we set a unique stream name, stream22, and use a div element to display the captured camera on the web page.
Playing the same stream in a browser works as follows:
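A sketch of the playback side under the same assumptions (the names are illustrative, not the exact SDK API):

```javascript
// Sketch: play a previously published stream by name in a div element.
// "session" stands for an established SDK session (an assumption here).
function playStream(session, options) {
  var stream = session.createStream({
    name: options.name,       // name of the stream to play, e.g. "stream22"
    display: options.display  // div element the remote video renders into
  });
  stream.play();              // negotiate WebRTC and start playback
  return stream;
}
```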
That is, we set the name of the stream and specify the div-element to play this video in. Then, we call the play() method.
iOS Safari is currently the only browser that requires clicking the element before the video starts playing.
So we added a simple code specially for iOS Safari that 'activates' the video element before playing the stream by executing the following code:
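The gist of such an 'activation' is to call play() on the video element synchronously inside the tap handler. A sketch under that assumption (the helper name is illustrative, not the player's actual source):

```javascript
// Sketch: "activate" a <video> element inside a user-gesture handler so
// iOS Safari will allow subsequent playback. play() must be called
// synchronously in the click/touch handler to count as user-initiated.
function activateVideoElement(video) {
  video.muted = true;            // muted playback is allowed to start
  var result = video.play();     // returns a promise in modern browsers
  if (result && typeof result.catch === 'function') {
    result.catch(function () {
      // play() was still rejected; wait for another user gesture
    });
  }
  return video;
}
```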
This code is executed in the standard player upon clicking the play button, so we fulfill Apple's security requirements and correctly start playing the video.
In conclusion
iOS 11 Safari has finally received support for WebRTC, and this support isn't going to be removed in future releases. So we can use it to create real-time streaming video and browser calls. Install further iOS 11.x updates and wait for new fixes (and new bugs). Good streaming!
Links
WCS – the server we used to test broadcasting on iOS 11 Safari
Two Way Streaming – the example of a streamer
Source Two Way Streaming – streamer sources
Player – the example of a player
Source Player – player sources
WSPlayer – playing low latency video streams in iOS 9, 10 Safari