
I have been trying to use the OpenEars v2.03 iOS framework project in a Xamarin iOS binding project. Let me explain what I have done so far. I am new to Xcode, Xamarin and all of this binding work, so this is going to be a long question. Please bear with me…

1) Built the OpenEars framework project in Xcode for the Simulator. Copied the "OpenEars" binary from Framework/OpenEars.framework/Versions/Current/ and renamed it to "libOpenEars-i386.a".

Similarly, built the same library for an iPhone 4s by connecting the device to the Mac and selecting my iPhone as the build target. Then copied the resulting OpenEars binary and renamed it to "libOpenEars-armv7.a".

2) Bundled the two files (libOpenEars-i386.a and libOpenEars-armv7.a) into a single file, "libOpenEars.a", with the following lipo command:

lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-armv7.a 

3) Created a binding project in Xamarin Studio and added libOpenEars.a, which automatically generated a libOpenEars.linkwith.cs. Here is the generated code:

using System;
using ObjCRuntime;

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks="AudioToolbox AVFoundation", IsCxx=true, LinkerFlags = "-lstdc++")]

I also tried changing the flags, e.g. LinkerFlags = "-lstdc++ -lc++ -ObjC" and SmartLink = false.
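For reference, the variant I experimented with (my own edit of the generated libOpenEars.linkwith.cs, shown here only as a sketch of that change) looked like this:

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = false, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++ -lc++ -ObjC")]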

4) My ApiDefinition file contains all of the OpenEars interfaces; I am only including one of them here.

[BaseType(typeof(NSObject))]
[Protocol]
interface OEEventsObserver
{
    [Wrap ("WeakDelegate")]
    OEEventsObserverDelegate Delegate { get; set; }

    [Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
    NSObject WeakDelegate { get; set; }
}
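
The matching delegate interface in the same ApiDefinition file looks roughly like this. It is trimmed to two methods here, and the selector names are what I read from the OEEventsObserverDelegate protocol in the OpenEars 2.x headers, so treat it as a sketch rather than the full binding:

[BaseType (typeof (NSObject))]
[Model]
[Protocol]
interface OEEventsObserverDelegate
{
    // Selector as declared in the OpenEars OEEventsObserverDelegate protocol (assumed from the headers)
    [Export ("pocketsphinxDidStartListening")]
    void PocketsphinxDidStartListening ();

    // The three string arguments mirror the Objective-C signature (assumed from the headers)
    [Export ("pocketsphinxDidReceiveHypothesis:recognitionScore:utteranceID:")]
    void PocketsphinxDidReceiveHypothesis (NSString hypothesis, NSString recognitionScore, NSString utteranceID);
}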

5) Referenced OpenEars.dll in my iOS sample project.

6) Added the language model and acoustic model to the binding library itself (even though this would not be needed with dynamic language model generation). I based my sample on the old OpenEars example project from this OpenEars Xamarin git; I did not use the new dynamic language model generator, but modified the sample to reflect the latest API changes.

View controller:

public partial class OpenEarsNewApiViewController : UIViewController
{
    OEEventsObserver observer;
    OEFliteController fliteController;
    OEPocketsphinxController pocketSphinxController;


    String pathToLanguageModel;
    String pathToDictionary;
    String pathToAcousticModel;

    String firstVoiceToUse;
    String secondVoiceToUse;

    static bool UserInterfaceIdiomIsPhone {
        get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
    }

    public void init()
    {
        try
        {
            observer = new OEEventsObserver();
            observer.Delegate = new OpenEarsEventsObserverDelegate (this);
            pocketSphinxController = new OEPocketsphinxController ();

            fliteController = new OEFliteController();

            firstVoiceToUse = "cmu_us_slt";
            secondVoiceToUse = "cmu_us_rms";

            pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
            pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
            pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
        }
        catch (Exception e) {
            Console.WriteLine ("Exception Message :" + e.Message);
            if (e.InnerException != null)
                Console.WriteLine ("Inner Exception Message :" + e.InnerException.Message);
        }

    }

    public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
    {
        init ();
    }

    #region Update

    public void UpdateStatus (String text)
    {
        txtStatus.Text = text;
    }

    public void UpdateText (String text)
    {
        txtOutput.Text = text;
    }

    public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
    {
        btnStartListening.Hidden = hidden1;
        btnStopListening.Hidden = hidden2;
        btnSuspend.Hidden = hidden3;
        btnResume.Hidden = hidden4;
    }

    public void Say (String text)
    {
        //fliteController.SaywithVoice (text, secondVoiceToUse);
    }

    public void StartListening ()
    {
        //pocketSphinxController.RequestMicPermission ();
        if (!pocketSphinxController.IsListening) {

            //NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];


            pocketSphinxController.StartListeningWithLanguageModelAtPath (
                pathToLanguageModel,
                pathToDictionary,
                pathToAcousticModel,
                false
            );
        } else {
            new UIAlertView ("Notify !!","Already Listening",null,"OK","Stop").Show();

        }

    }

    public void StopListening ()
    {
        //pocketSphinxController.StopListening ();
    }

    public void SuspendRecognition ()
    {
        pocketSphinxController.SuspendRecognition ();
    }

    public void ResumeRecognition ()
    {
        pocketSphinxController.ResumeRecognition ();
    }

    #endregion

    #region Event Handlers

    partial void btnStartListening_TouchUpInside (UIButton sender)
    {
        try
        {
            StartListening();
            //fliteController.Init();
            //Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
            //fliteController.Say("Hai", new OEFliteVoice());

            UpdateButtonStates (true, false, false, true);
            Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
        }
        catch(Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }

    partial void btnStopListening_TouchUpInside (UIButton sender)
    {
        StopListening ();
        UpdateButtonStates (false, true, true, true);
    }

    partial void btnSuspend_TouchUpInside (UIButton sender)
    {
        SuspendRecognition ();
        UpdateButtonStates (true, false, true, false);
    }

    partial void btnResume_TouchUpInside (UIButton sender)
    {
        ResumeRecognition ();
        UpdateButtonStates (true, false, false, true);
    }
}


OpenEarsEventsObserverDelegate:

// nothing much here, just status updates for debugging

public class OpenEarsEventsObserverDelegate:OEEventsObserverDelegate
{
    OpenEarsNewApiViewController _controller;

    public OpenEarsNewApiViewController controller {
        get {
            return _controller;
        }
        set {
            _controller = value;
        }
    }

    public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
    {
        controller = ctrl;
    }

    public override void PocketsphinxRecognitionLoopDidStart()
    {
        //base.PocketsphinxRecognitionLoopDidStart();

        Console.WriteLine ("Pocketsphinx is starting up");
        controller.UpdateStatus ("Pocketsphinx is starting up");
    }

    public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceID)
    {
        controller.UpdateText ("Heard: " + hypothesis);
        controller.Say ("You said: " + hypothesis);
    }

    public override void PocketSphinxContinuousSetupDidFail ()
    {

    }

    public override void PocketsphinxDidCompleteCalibration ()
    {
        Console.WriteLine ("Pocket calibration is complete");
        controller.UpdateStatus ("Pocket calibratio is complete");
    }

    public override void PocketsphinxDidDetectSpeech ()
    {

    }

    public override void PocketsphinxDidStartListening ()
    {
        Console.WriteLine ("Pocketsphinx is now listening");
        controller.UpdateStatus ("Pocketphinx is now listening");
        controller.UpdateButtonStates (true, false, false, true);
    }

    public override void PocketsphinxDidStopListening ()
    {

    }

    public override void PocketsphinxDidStartCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration has started.");
        controller.UpdateStatus ("Pocketsphinx calibration has started");
    }

    public override void PocketsphinxDidResumeRecognition ()
    {

    }

    public override void PocketsphinxDidSuspendRecognition ()
    {

    }

    public override void PocketsphinxDidDetectFinishedSpeech ()
    {

    }

    public override void FliteDidStartSpeaking ()
    {

    }

    public override void FliteDidFinishSpeaking ()
    {

    }
}

This works perfectly on the iOS simulator, but not on a real device.

Simulator screenshot.

I get this error message when running on the device, and the same message appears for every one of the bound interfaces.

Exception Message :Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.

2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed  exception: Exception has been thrown by the target of an invocation.  (System.Reflection.TargetInvocationException)
at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj,   System.Object[] parameters) [0x00016] in   /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543 

Am I missing anything in the binding that is specific to running on a device?

I also tried building the same .dll with a makefile, but got the same error message.

Building the OpenEars framework:

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build

Makefile that generates OpenEars.dll:

BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native

all: OpenEars.dll


OpenEars.dll: AssemblyInfo.cs OpenEars.cs libOpenEars.a
$(BTOUCH) -unsafe --new-style -out:$@ OpenEars.cs -x=AssemblyInfo.cs --link-with=libOpenEars.a,libOpenEars.a

clean:
   -rm -f *.dll

See the complete mtouch error log here.

$lipo -info libOpenEars.a

Architectures in the fat file: libOpenEars.a are: i386 armv7 

Checked with $ nm -arch armv7 libOpenEars.a

The full nm output is here.

Checked that the OEEvent symbols are present in the simulator (i386) slice:

$ nm -arch i386 libOpenEars.a | grep OEEvent

Output:

U _OBJC_CLASS_$_OEEventsObserver
00006aa0 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002d90 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

Checked that the OEEvent symbols are present in the armv7 slice:

$nm -arch armv7 libOpenEars.a | grep OEEvent

Output:

 U _OBJC_CLASS_$_OEEventsObserver
00005680 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning:    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002638 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I am not sure what I am missing. And yes, there are plenty of grammar mistakes in here; thank you for taking the time to read through it all.


1 Answer


Thanks to @poupou and @Halle for the valuable suggestions. In the end I built the fat binary with all of the architectures, including arm64 and x86_64 (this is a must), and used lipo to bundle everything into one package. Now it works like a charm! I also set Project Options -> Advanced -> Supported Architectures -> ARMv7 in order to run on devices such as the iPad 2 and iPhone 4. I still need to test on the iPhone 6 and 6+, and I expect they will work too since they belong to the arm64 family. I am not sure how this behaves on ARMv7s devices (iPhone 5, iPhone 5c, iPad 4); I do not see ARMv7s support in OpenEars v2.03.
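
For completeness: I rebuilt the missing slices with xcodebuild (x86_64 for the simulator, arm64 for the device) and passed all four .a files to the same lipo -create step from step 2. Once the fat library contains every slice, the binding attribute can declare all four targets; a sketch, reusing the file name and flags from step 3:

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Arm64 | LinkTarget.Simulator | LinkTarget.Simulator64, SmartLink = true, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++")]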

Answered 2017-07-11T14:10:48.470