
serialization - Objective-C - How to serialize an audio file into small packets that can be played?

So, I would like to take a sound file, convert it into packets, and send it to another computer. I would like the other computer to be able to play the packets as they arrive.

I am using AVAudioPlayer to try to play these packets, but I couldn't find a proper way to serialize the data on peer1 so that peer2 can play it.

The scenario is: peer1 has an audio file, splits it into many small packets, wraps them in NSData and sends them to peer2. Peer2 receives the packets and plays them one by one, as they arrive.

Does anyone know how to do this, or whether it is even possible?

EDIT:

Here is some code to illustrate what I would like to achieve.


// This code is part of the peer1, the one who sends the data
- (void)sendData
{
    int packetId = 0;
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"myAudioFile" ofType:@"wav"];

    NSData *soundData = [[NSData alloc] initWithContentsOfFile:soundFilePath];
    NSMutableArray *arraySoundData = [[NSMutableArray alloc] init];

    // Splitting the audio into 2 pieces
    // This is only an illustration
    // The idea is to split the data into multiple pieces
    // depending on the size of the file to be sent
    NSRange soundRange;
    soundRange.length = [soundData length]/2;
    soundRange.location = 0;
    [arraySoundData addObject:[soundData subdataWithRange:soundRange]];
    soundRange.location = [soundData length]/2;
    soundRange.length = [soundData length] - soundRange.location; // remainder, handles odd lengths
    [arraySoundData addObject:[soundData subdataWithRange:soundRange]];

    for (int i=0; i<[arraySoundData count]; i++)
    {
        NSData *soundPacket = [arraySoundData objectAtIndex:i];

        if(soundPacket == nil)
        {
            NSLog(@"soundData is nil");
            return;
        }       

        NSMutableData* message = [[NSMutableData alloc] init];
        NSKeyedArchiver* archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:message];
        [archiver encodeInt:packetId++ forKey:PACKET_ID];
        [archiver encodeObject:soundPacket forKey:PACKET_SOUND_DATA];
        [archiver finishEncoding];      

        NSError* error = nil;
        [connectionManager sendMessage:message error:&error];
        if (error) NSLog (@"send packet failed: %@" , [error localizedDescription]);

        [message release];
        [archiver release];
    }

    [soundData release];
    [arraySoundData release];
}

// This is the code on peer2 that would receive and play the piece of audio on each packet

- (void) receiveData:(NSData *)data
{

    NSKeyedUnarchiver* unarchiver = [[NSKeyedUnarchiver alloc] initForReadingWithData:data];

    if ([unarchiver containsValueForKey:PACKET_ID])
        NSLog(@"DECODED PACKET_ID: %i", [unarchiver decodeIntForKey:PACKET_ID]);

    if ([unarchiver containsValueForKey:PACKET_SOUND_DATA])
    {
        NSLog(@"DECODED sound");
        NSData *sound = (NSData *)[unarchiver decodeObjectForKey:PACKET_SOUND_DATA];

        if (sound == nil)
        {
            NSLog(@"sound is nil!");

        }
        else
        {
            NSLog(@"sound is not nil!");

            NSError *playerError = nil;
            AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithData:sound error:&playerError];

            if (audioPlayer)
            {
                [audioPlayer prepareToPlay];
                [audioPlayer play];
                // NOTE: keep a reference and release the player once it has finished playing
            } else {
                NSLog(@"Player couldn't load data: %@", [playerError localizedDescription]);
            }
        }
    }

    [unarchiver release];
}

So, that is what I am trying to achieve. What I really need to know is how to create the packets so that peer2 can play the audio.

It would be a kind of streaming. For now I am not worried about the order in which the packets are received or played; I only need the sound sliced into pieces, and each piece, each slice, playable without waiting for the whole file to be received by peer2.
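
For example, would something along these lines be the right direction for creating the packets? This is only a rough, untested sketch: it uses AudioFileReadPackets to read whole audio packets instead of slicing raw bytes, and the PACKET_START and PACKET_DESCRIPTIONS keys are just placeholders, like PACKET_ID and PACKET_SOUND_DATA above.

// Rough sketch (untested): read whole audio packets from the file and archive
// them together with their packet descriptions, instead of slicing raw bytes.
// fileID and maxPacketSize would come from AudioFileOpenURL / AudioFileGetProperty.
// (requires #import <AudioToolbox/AudioToolbox.h>)
- (NSData *)messageWithPacketsStartingAt:(SInt64)startPacket
                                   count:(UInt32)packetCount
                                fromFile:(AudioFileID)fileID
                           maxPacketSize:(UInt32)maxPacketSize
{
    UInt32 numBytes = 0;
    UInt32 numPackets = packetCount;
    void *audioData = malloc(packetCount * maxPacketSize);
    AudioStreamPacketDescription *descriptions =
        malloc(packetCount * sizeof(AudioStreamPacketDescription));

    // Read up to packetCount whole packets starting at startPacket
    AudioFileReadPackets(fileID, false, &numBytes, descriptions,
                         startPacket, &numPackets, audioData);

    NSMutableData *message = [NSMutableData data];
    NSKeyedArchiver *archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:message];
    [archiver encodeInt64:startPacket forKey:@"PACKET_START"];                  // placeholder key
    [archiver encodeObject:[NSData dataWithBytes:audioData length:numBytes]
                    forKey:PACKET_SOUND_DATA];
    [archiver encodeObject:[NSData dataWithBytes:descriptions
                                          length:numPackets * sizeof(AudioStreamPacketDescription)]
                    forKey:@"PACKET_DESCRIPTIONS"];                             // placeholder key
    [archiver finishEncoding];
    [archiver release];

    free(audioData);
    free(descriptions);
    return message;   // autoreleased
}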

Thanks!

See Question&Answers more detail:os

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome To Ask or Share your Answers For Others

1 Answer


Here is the simplest class to play files with Audio Queue Services (AQ). Note that you can play it from any point (just set currentPacketNumber):

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface AudioFile : NSObject {
    AudioFileID                     fileID;     // the identifier for the audio file to play
    AudioStreamBasicDescription     format;
    UInt64                          packetsCount;           
    UInt32                          maxPacketSize;  
}

@property (readwrite)           AudioFileID                 fileID;
@property (readwrite)           UInt64                      packetsCount;
@property (readwrite)           UInt32                      maxPacketSize;

- (id) initWithURL: (CFURLRef) url;
- (AudioStreamBasicDescription *)audioFormatRef;

@end


//  AudioFile.m

#import "AudioFile.h"


@implementation AudioFile

@synthesize fileID;
@synthesize maxPacketSize;
@synthesize packetsCount;

- (id)initWithURL:(CFURLRef)url{
    if (self = [super init]){       
        AudioFileOpenURL(
                         url,
                         kAudioFileReadPermission, // read only
                         0, // no file type hint
                         &fileID
                         );

        UInt32 sizeOfPlaybackFormatASBDStruct = sizeof format;
        AudioFileGetProperty (
                              fileID, 
                              kAudioFilePropertyDataFormat,
                              &sizeOfPlaybackFormatASBDStruct,
                              &format
                              );

        UInt32 propertySize = sizeof (maxPacketSize);

        AudioFileGetProperty (
                              fileID, 
                              kAudioFilePropertyMaximumPacketSize,
                              &propertySize,
                              &maxPacketSize
                              );

        propertySize = sizeof(packetsCount);
        AudioFileGetProperty(fileID, kAudioFilePropertyAudioDataPacketCount, &propertySize, &packetsCount);
    }
    return self;
} 

-(AudioStreamBasicDescription *)audioFormatRef{
    return &format;
}

- (void) dealloc {
    AudioFileClose(fileID);
    [super dealloc];
}

@end



//  AQPlayer.h

#import <Foundation/Foundation.h>
#import "AudioFile.h"

#define AUDIOBUFFERS_NUMBER     3
#define MAX_PACKET_COUNT    4096

@interface AQPlayer : NSObject {
@public
    AudioQueueRef                   queue;
    AudioQueueBufferRef             buffers[AUDIOBUFFERS_NUMBER];
    NSInteger                       bufferByteSize;
    AudioStreamPacketDescription    packetDescriptions[MAX_PACKET_COUNT];

    AudioFile * audioFile;
    SInt64  currentPacketNumber;
    UInt32  numPacketsToRead;
}

@property (nonatomic)               SInt64          currentPacketNumber;
@property (nonatomic, retain)       AudioFile       * audioFile;

-(id)initWithFile:(NSString *)file;
-(NSInteger)fillBuffer:(AudioQueueBufferRef)buffer;
-(void)play;

@end 

//  AQPlayer.m

#import "AQPlayer.h"

static void AQOutputCallback(void * inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer) {
    AQPlayer * aqp = (AQPlayer *)inUserData;
    [aqp fillBuffer:(AudioQueueBufferRef)inBuffer];
}

@implementation AQPlayer

@synthesize currentPacketNumber;
@synthesize audioFile;

-(id)initWithFile:(NSString *)file{
    if (self = [super init]){
        audioFile = [[AudioFile alloc] initWithURL:(CFURLRef)[NSURL fileURLWithPath:file]];
        currentPacketNumber = 0;
        AudioQueueNewOutput ([audioFile audioFormatRef], AQOutputCallback, self, CFRunLoopGetCurrent (), kCFRunLoopCommonModes, 0, &queue);
        bufferByteSize = 4096;
        if (bufferByteSize < audioFile.maxPacketSize) bufferByteSize = audioFile.maxPacketSize; 
        numPacketsToRead = bufferByteSize/audioFile.maxPacketSize;
        for(int i=0; i<AUDIOBUFFERS_NUMBER; i++){
            AudioQueueAllocateBuffer (queue, bufferByteSize, &buffers[i]);
        }
    }
    return self;
}

-(void) dealloc{
    [audioFile release];
    if (queue){
        AudioQueueDispose(queue, YES);
        queue = nil;
    }
    [super dealloc];
}

- (void)play{
    for (int bufferIndex = 0; bufferIndex < AUDIOBUFFERS_NUMBER; ++bufferIndex){
        [self fillBuffer:buffers[bufferIndex]];
    }
    AudioQueueStart (queue, NULL);

}

-(NSInteger)fillBuffer:(AudioQueueBufferRef)buffer{
    UInt32 numBytes;
    UInt32 numPackets = numPacketsToRead;
    BOOL isVBR = [audioFile audioFormatRef]->mBytesPerPacket == 0 ? YES : NO;
    AudioFileReadPackets(
                         audioFile.fileID,
                         NO,
                         &numBytes,
                         isVBR ? packetDescriptions : 0,
                         currentPacketNumber,
                         &numPackets, 
                         buffer->mAudioData
                         );

    if (numPackets > 0) {
        buffer->mAudioDataByteSize = numBytes;
        AudioQueueEnqueueBuffer (
                                 queue,
                                 buffer,
                                 isVBR ? numPackets : 0,
                                 isVBR ? packetDescriptions : 0
                                 );
        // advance the read position so the next callback continues from here
        currentPacketNumber += numPackets;
    }
    else{
        // end of present data, check if all packets are played
        // if yes, stop play and dispose queue
        // if no, pause queue till new data arrive then start it again
    }
    return  numPackets;
}

@end
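
A usage sketch (untested, assuming the bundled "myAudioFile.wav" from the question):

// Create the player and start playback from any packet index
NSString *path = [[NSBundle mainBundle] pathForResource:@"myAudioFile" ofType:@"wav"];
AQPlayer *player = [[AQPlayer alloc] initWithFile:path];
player.currentPacketNumber = 0;   // or any packet index you want to start from
[player play];

And one possible way to flesh out the empty else branch in fillBuffer: above. Again, only a sketch: hasMoreDataExpected is a hypothetical BOOL ivar that the code receiving packets would maintain.

// Called from the else branch of fillBuffer: when no packets could be read
- (void)handleBufferUnderrun {
    if (hasMoreDataExpected) {
        // No packets available right now, but more are on the way:
        // pause the queue and call AudioQueueStart(queue, NULL) again
        // once the next packets have arrived.
        AudioQueuePause(queue);
    } else {
        // Everything has been played: stop asynchronously;
        // dealloc will dispose of the queue.
        AudioQueueStop(queue, NO);
    }
}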
