
I am recording a sound file (wav format) in Objective-C. I want to pass it back to Javascript using Objective-C's stringByEvaluatingJavaScriptFromString. I am thinking that I will have to convert the wav file to a base64 string to pass it to this function, and then convert the base64 string back to (wav/blob) format in javascript to pass it to an audio tag to play it. I don't know how to do that, and I am also not sure whether this is the best way to pass the wave file back to javascript. Any ideas will be appreciated.
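The base64 route the question proposes can be sketched on the javascript side like this (a minimal sketch; the function name is hypothetical, and the base64 string is whatever the native layer hands over):

```javascript
// Hypothetical helper: decode a base64 string received from the native
// layer back into a Blob that an <audio> tag can play.
function base64ToWavBlob(base64) {
  var binary = atob(base64);              // base64 -> binary string
  var bytes = new Uint8Array(binary.length);
  for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);      // copy raw byte values
  }
  return new Blob([bytes], { type: 'audio/wav' });
}

// Usage: point an audio element at the decoded data.
// var url = URL.createObjectURL(base64ToWavBlob(b64FromObjC));
// document.querySelector('audio').src = url;
```

Note that base64 inflates the payload by about a third, which is part of why the answer below ends up streaming raw chunks over a websocket instead.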


2 Answers


Well, this was not as straightforward as I expected. Here is how I was able to do it.

Step 1: I record the audio in caf format using AVAudioRecorder.

NSArray *dirPaths;
NSString *docsDir;

dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);

docsDir = [dirPaths objectAtIndex:0];

soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"];

NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];

// note: AVEncoderBitRateKey expects bits per second; the 16 here
// looks like it was meant as a bit depth
NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
    [NSNumber numberWithInt:16], AVEncoderBitRateKey,
    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
    [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
    nil];

NSError *error = nil;

audioRecorder = [[AVAudioRecorder alloc]
                 initWithURL:soundFileURL
                 settings:recordSettings error:&error];

if(error)
{
    NSLog(@"error: %@", [error localizedDescription]);
} else {
    [audioRecorder prepareToRecord];
}

After this, you just need to call audioRecorder.record to record the audio. It will be recorded in caf format. If you want to see my recordAudio function, here it is.

- (void)recordAudio
{
    if (!audioRecorder.recording)
    {
        _playButton.enabled = NO;
        _recordButton.title = @"Stop";
        [audioRecorder record];
        [self animate1:nil finished:nil context:nil];
    }
    else
    {
        [_recordingImage stopAnimating];
        [audioRecorder stop];
        _playButton.enabled = YES;
        _recordButton.title = @"Record";
    }
}

Step 2: Convert the caf format to wav format. I was able to do this using the following function.

- (BOOL)exportAssetAsWaveFormat:(NSString *)filePath
{
NSError *error = nil;

NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                              [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                              [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                              [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                              [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                              [NSData data], AVChannelLayoutKey, nil];

NSString *audioFilePath = filePath;
AVURLAsset *URLAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil];

if (!URLAsset) return NO;

AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
if (error) return NO;

NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
if (![tracks count]) return NO;

AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
                                               assetReaderAudioMixOutputWithAudioTracks:tracks
                                               audioSettings:audioSetting];

if (![assetReader canAddOutput:audioMixOutput]) return NO;

[assetReader addOutput:audioMixOutput];

if (![assetReader startReading]) return NO;



NSString *title = @"WavConverted";
NSArray *docDirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDir = [docDirs objectAtIndex:0];
NSString *outPath = [[docDir stringByAppendingPathComponent:title]
                     stringByAppendingPathExtension:@"wav"];

// remove any previous export; don't bail out on the first run,
// when the file doesn't exist yet
if ([[NSFileManager defaultManager] fileExistsAtPath:outPath])
{
    [[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL];
}

soundFilePath = outPath;

NSURL *outURL = [NSURL fileURLWithPath:outPath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL
                                                      fileType:AVFileTypeWAVE
                                                         error:&error];
if (error) return NO;

AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                          outputSettings:audioSetting];
assetWriterInput.expectsMediaDataInRealTime = NO;

if (![assetWriter canAddInput:assetWriterInput]) return NO;

[assetWriter addInput:assetWriterInput];

if (![assetWriter startWriting]) return NO;


//[assetReader retain];
//[assetWriter retain];

[assetWriter startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t queue = dispatch_queue_create("assetWriterQueue", NULL);

[assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{

    NSLog(@"start");

    while (1)
    {
        if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) {

            CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];

            if (sampleBuffer) {
                [assetWriterInput appendSampleBuffer:sampleBuffer];
                CFRelease(sampleBuffer);
            } else {
                [assetWriterInput markAsFinished];
                break;
            }
        }
    }

    [assetWriter finishWriting];

    //[self playWavFile];
    NSError *err;
    NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options:0 error:&err];
    [self.audioDelegate doneRecording:audioData];
    //[assetReader release];
    //[assetWriter release];
    NSLog(@"soundFilePath=%@", soundFilePath);
    NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err];
    NSLog(@"size of wav file = %@", [dict objectForKey:NSFileSize]);
    //NSLog(@"finish");
}];

return YES;
}
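As a rough sanity check on the file size logged above (a sketch, assuming the 44100 Hz / 2-channel / 16-bit settings used in this function): uncompressed PCM grows at sampleRate × channels × bytesPerSample per second, plus a 44-byte canonical WAV header.

```javascript
// Sketch: expected size of a canonical 16-bit PCM WAV file for a given
// duration. Assumes the export settings above: 44100 Hz, 2 ch, 16-bit.
function expectedWavBytes(seconds, sampleRate, channels, bitsPerSample) {
  var bytesPerSecond = sampleRate * channels * (bitsPerSample / 8);
  return 44 + Math.round(seconds * bytesPerSecond); // 44-byte RIFF/fmt/data header
}
```

So one second of stereo 16-bit audio at 44100 Hz should come out around 176 KB; if the logged size is wildly different, the export settings didn't take.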

In this function, I call the audioDelegate function doneRecording with audioData in wav format. Here is the code for doneRecording.

- (void)doneRecording:(NSData *)contents
{
    myContents = [[NSData dataWithData:contents] retain];
    [self returnResult:alertCallbackId args:@"Recording Done.", nil];
}

// Call this function when you have results to send back to javascript callbacks
// callbackId : int comes from handleCall function
// args: list of objects to send to the javascript callback
- (void)returnResult:(int)callbackId args:(id)arg, ...
{
  if (callbackId==0) return;

  va_list argsList;
  NSMutableArray *resultArray = [[NSMutableArray alloc] init];

  if(arg != nil){
    [resultArray addObject:arg];
    va_start(argsList, arg);
    while((arg = va_arg(argsList, id)) != nil)
      [resultArray addObject:arg];
    va_end(argsList);
  }

  NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil];
  [self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:)
                         withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);", callbackId, resultArrayString]
                      waitUntilDone:NO];
  [resultArray release];
}

Step 3: Now it is time to tell the javascript inside the UIWebView that we are done recording the audio, so it can start accepting our data in chunks. I am using websockets to transfer the data back to javascript. The data is transferred in chunks because the server I used (https://github.com/benlodotcom/BLWebSocketsServer) is built on top of libwebsockets (http://git.warmcat.com/cgi-bin/cgit/libwebsockets/).

This is how you start the server in the delegate class.

- (id)initWithFrame:(CGRect)frame
{
  if (self = [super initWithFrame:frame]) {

    [self _createServer];
    [self.server start];
    myContents = [NSData data];

    // Set delegate in order for "shouldStartLoadWithRequest" to be called
    self.delegate = self;

    // Set non-opaque in order to make "body{background-color:transparent}" work
    self.opaque = NO;

    // Instantiate JSON parser library
    json = [SBJSON new];

    // load our html file
    NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"];
    [self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]];
  }
  return self;
}
-(void) _createServer
{
    /*Create a simple echo server*/
    self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol];
    [self.server setHandleRequestBlock:^NSData *(NSData *data) {

        NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
        NSLog(@"Received Request...%@",convertedString);

        if([convertedString isEqualToString:@"start"])
        {
            NSLog(@"myContents size: %d", (int)[myContents length]);

            int contentSize = (int)[myContents length];
            int chunkSize = 64*1023;
            chunksCount = (contentSize/chunkSize)+1;

            NSLog(@"ChunkSize=%d", chunkSize);
            NSLog(@"chunksCount=%d", chunksCount);

            chunksArray =  [[NSMutableArray array] retain];

            int index = 0;
            //NSRange chunkRange;

            for(int i=1;i<=chunksCount;i++)
            {

                if(i==chunksCount)
                {
                    NSRange chunkRange = {index,contentSize-index};
                    NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,contentSize-index);
                    NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                    [chunksArray addObject:dataChunk];
                    break;
                }
                else
                {
                    NSRange chunkRange = {index, chunkSize};
                    NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,chunkSize);
                    NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                    index += chunkSize;
                    [chunksArray addObject:dataChunk];
                }
            }

            return [chunksArray objectAtIndex:0];

        }
        else
        {
            int chunkNumber = [convertedString intValue];

            if(chunkNumber>0 && (chunkNumber+1)<=chunksCount)
            {
                return [chunksArray objectAtIndex:(chunkNumber)];
            }


        }

        NSLog(@"Releasing Array");
        [chunksArray release];
        chunksCount = 0;
        return [NSData dataWithBase64EncodedString:@"Stop"];
    }];
}
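The chunking arithmetic in the handler above can be checked in isolation. Here is the same index math as a javascript sketch (the function name is hypothetical; the formula and chunk size are copied from the Objective-C code):

```javascript
// Sketch of the server's chunking logic: split contentSize bytes into
// 64*1023-byte chunks, with the remainder going into the final chunk.
// Note: like the Obj-C code, an exact multiple yields an empty last chunk.
function chunkRanges(contentSize, chunkSize) {
  chunkSize = chunkSize || 64 * 1023;
  var count = Math.floor(contentSize / chunkSize) + 1; // same formula as chunksCount
  var ranges = [];
  var index = 0;
  for (var i = 1; i <= count; i++) {
    if (i === count) {
      ranges.push({ offset: index, length: contentSize - index }); // final partial chunk
    } else {
      ranges.push({ offset: index, length: chunkSize });
      index += chunkSize;
    }
  }
  return ranges;
}
```

For a 200000-byte recording this produces three full 65472-byte chunks and one 3584-byte tail, which is exactly what the NSLog lines in the handler print.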

The code on the javascript side is:

var socket;
var chunkCount = 0;
var soundBlob, soundUrl;
var smallBlobs = new Array();

function captureMovieCallback(response)
{
    if(socket)
    {
        try{
            socket.send('start');
        }
        catch(e)
        {
            log('Socket is not valid object');
        }

    }
    else
    {
        log('socket is null');
    }
}

function closeSocket(response)
{
    socket.close();
}


function connect(){
    try{
        window.WebSocket = window.WebSocket || window.MozWebSocket;

        socket = new WebSocket('ws://127.0.0.1:9000',
                                      'echo-protocol');

        socket.onopen = function(){
        }

        socket.onmessage = function(e){
            var data = e.data;
            if(e.data instanceof ArrayBuffer)
            {
                log('its arrayBuffer');
            }
            else if(e.data instanceof Blob)
            {
                if(soundBlob)
                   log('its Blob of size = '+ e.data.size + ' final blob size:'+ soundBlob.size);

                if(e.data.size != 3)
                {
                    //log('its Blob of size = '+ e.data.size);
                    smallBlobs[chunkCount]= e.data;
                    chunkCount = chunkCount +1;
                    socket.send(''+chunkCount);
                }
                else
                {
                    //alert('End Received');
                    try{
                    soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });
                    var myURL = window.URL || window.webkitURL;
                    soundUrl = myURL.createObjectURL(soundBlob);
                    log('soundURL='+soundUrl);
                    }
                    catch(e)
                    {
                        log('Problem creating blob and url.');
                    }

                    try{
                        var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record';
                        var xhr = new XMLHttpRequest();
                        xhr.open('POST',serverUrl,true);
                        xhr.setRequestHeader("content-type","multipart/form-data");
                        xhr.send(soundBlob);
                    }
                    catch(e)
                    {
                        log('error uploading blob file');
                    }

                    socket.close();
                }

                //alert(JSON.stringify(msg, null, 4));
            }
            else
            {
                log('dont know');
            }
        }

        socket.onclose = function(){
            //message('<p class="event">Socket Status: '+socket.readyState+' (Closed)');
            log('final blob size:'+soundBlob.size);
        }

    } catch(exception){
       log('<p>Error: '+exception);
    }
}

function log(msg) {
    NativeBridge.log(msg);
}
function stopCapture() {
    NativeBridge.call("stopMovie", null,null);
}

function startCapture() {
    NativeBridge.call("captureMovie",null,captureMovieCallback);
}

NativeBridge.js

var NativeBridge = {
  callbacksCount : 1,
  callbacks : {},

  // Automatically called by native layer when a result is available
  resultForCallback : function resultForCallback(callbackId, resultArray) {
    try {


    var callback = NativeBridge.callbacks[callbackId];
    if (!callback) return;
    console.log("calling callback for "+callbackId);
    callback.apply(null,resultArray);
    } catch(e) {alert(e)}
  },

  // Use this in javascript to request native objective-c code
  // functionName : string (I think the name is explicit :p)
  // args : array of arguments
  // callback : function with n-arguments that is going to be called when the native code returned
  call : function call(functionName, args, callback) {

    //alert("call");
    //alert('callback='+callback);
    var hasCallback = callback && typeof callback == "function";
    var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0;

    if (hasCallback)
      NativeBridge.callbacks[callbackId] = callback;

    var iframe = document.createElement("IFRAME");
    iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId+ ":" + encodeURIComponent(JSON.stringify(args)));
    document.documentElement.appendChild(iframe);
    iframe.parentNode.removeChild(iframe);
    iframe = null;

  },

    log : function log(message) {

        var iframe = document.createElement("IFRAME");
        iframe.setAttribute("src", "ios-log:"+encodeURIComponent(JSON.stringify("#iOS#" + message)));
        document.documentElement.appendChild(iframe);
        iframe.parentNode.removeChild(iframe);
        iframe = null;

    }

};
  1. We call connect() on the javascript side from the body onload on the html side.

  2. Once we receive the callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.

  3. The server on the objective-c side splits the wav audio data into small chunks of chunkSize=64*1023 and stores them in an array.

  4. It sends the first chunk back to the javascript side.

  5. Javascript accepts this chunk and sends back the number of the next chunk it needs from the server.

  6. The server sends the chunk indicated by that number. This process is repeated until we send the last chunk to javascript.

  7. Finally, we send a stop message back to the javascript side indicating that we are done. It is apparently 3 bytes in size (which is used as the criterion to break out of this loop.)

  8. Each chunk is stored as a small blob in an array. Now we create a bigger blob from these small blobs using the following line:

    soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });

    This blob is uploaded to the server, which writes the blob out as a wav file. We can pass the url of this wav file as the src of an audio tag to replay it on the javascript side.

  9. We close the websocket connection after we send the blob to the server.

    Hope this is clear enough to understand.
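The reassembly in step 8 can be exercised on its own. A minimal sketch (the chunk contents here are placeholder bytes, not real audio; the sizes mirror a 134528-byte recording split at 64*1023):

```javascript
// Sketch: reassemble received chunks into one Blob, as in step 8.
// Placeholder data stands in for the chunks received over the websocket.
var smallBlobs = [
  new Blob([new Uint8Array(65472)]),  // full-size chunk (64*1023 bytes)
  new Blob([new Uint8Array(65472)]),
  new Blob([new Uint8Array(3584)])    // final partial chunk
];
var soundBlob = new Blob(smallBlobs, { type: 'audio/wav' });
// soundBlob.size should equal the sum of the chunk sizes
```

The final blob's size should equal the wav file size logged on the objective-c side; comparing the two is a quick way to catch a dropped or duplicated chunk.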

Answered 2013-04-09T17:10:34.430

If you just want to play the sound, you are far better off using one of the native audio playback systems in iOS rather than an HTML audio tag.

Answered 2013-03-29T17:32:06.040