ios4 - What's the best way of live streaming the iPhone camera to a media server?

According to this question, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.

So the questions are:

1) How do I get compressed frames and audio from the iPhone's camera?

2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?

Any help would be really appreciated.

Thanks.



1 Answer


You most likely already know....

1) How do I get compressed frames and audio from the iPhone's camera?

You cannot do this. The AVFoundation API has prevented this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck. You have no choice but to write it to a file. In your linked post, a user suggests setting up the callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format. It is the movie writers and AVAssetWriter that do the encoding.
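For illustration, here is a minimal sketch of that capture path in Swift, using the current AVFoundation API names rather than the iOS 4-era Objective-C ones. The delegate only ever receives uncompressed pixel buffers in whatever pixel format you request; there is no setting that makes it deliver H.264.

import AVFoundation

final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Called once per frame; the sample buffer wraps an *uncompressed* pixel buffer,
    // which is exactly the limitation described above.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // pixelBuffer holds raw BGRA data here; this is where you would hand it to an encoder.
        _ = CVPixelBufferGetWidth(pixelBuffer)
    }
}

func makeCaptureSession(delegate: FrameReceiver, queue: DispatchQueue) -> AVCaptureSession? {
    let session = AVCaptureSession()
    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    let output = AVCaptureVideoDataOutput()
    // You can choose the uncompressed pixel format, but compressed formats are not offered.
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    output.setSampleBufferDelegate(delegate, queue: queue)
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)
    return session
}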

2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?

Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.

I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.
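If you take that route, a rough sketch of the AVAssetWriter setup might look like the following (again Swift with modern API names; the .mp4 container, 1 Mbps bitrate, and dimensions are placeholder assumptions, not values from the original answer). The hardware-encoded H.264 only becomes readable from the file, which is why streaming apps of that era typically recorded short segments and uploaded the chunks.

import AVFoundation

// Sets up a writer that hardware-encodes incoming frames to H.264 in a local file.
func makeWriter(outputURL: URL, width: Int, height: Int) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height,
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: 1_000_000]
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true   // required when feeding live capture output

    guard writer.canAdd(input) else { throw NSError(domain: "WriterSetup", code: -1) }
    writer.add(input)
    return (writer, input)
}

// In the capture delegate, append each frame once the session is writing:
//   if input.isReadyForMoreMediaData { input.append(sampleBuffer) }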

