
I want to create a project that reads a user's gesture (based on the accelerometer) and recognizes it. I've searched a lot, but everything I found is too old. I have no problem with the classification and recognition part; I'll use the $1 recognizer or an HMM. I just want to know how to read the user's gesture using the accelerometer.

Is the accelerometer data (x, y, z values) enough, or should I use other data such as attitude data (roll, pitch, yaw), gyroscope data, or magnitude data? I don't really understand any of these, so please explain what these sensors are and whether they're useful.

Thanks in advance!


1 Answer


Finally I did it. I used the userAcceleration data, which is the device's acceleration due to the user, excluding gravity. I found that a lot of people use the normal acceleration data and do a lot of math to remove gravity from it; now that's already done for you by iOS 6 in userAcceleration.
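
For example, here's a minimal sketch (assuming a CMMotionManager called motionManager with device-motion updates already started) of what CMDeviceMotion gives you; the gravity part is already separated out, so no filtering math is needed just to remove gravity:

#import <CoreMotion/CoreMotion.h>

// Sketch only: motionManager is assumed to be a CMMotionManager
// that already has device-motion updates running.
CMDeviceMotion *motion = motionManager.deviceMotion;

CMAcceleration user    = motion.userAcceleration; // acceleration the user gives to the device
CMAcceleration gravity = motion.gravity;          // gravity component, already separated by Core Motion

// The raw accelerometer reading is (roughly) the sum of the two,
// which is why raw data needs extra math to strip gravity out:
double rawTotalX = user.x + gravity.x;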

And I used the $1 recognizer, which is a 2D recognizer (i.e. point(5, 10), no Z).
Here's a link for the $1 recognizer; there's a C++ version of it in the downloads section.

Here are the steps of my code...

  1. Read the userAcceleration data at a frequency of 50 Hz.
  2. Apply a low-pass filter to it.
  3. Take a point into consideration only if its x or y value is greater than 0.05, to reduce noise.
    (Note: the next steps depend on your code and on the recognizer you use.)
  4. Save the x and y points into an array.
  5. Create a 2D path from this array.
  6. Send this path to the recognizer to either train it or recognize it.
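
One note before the code: SharedMotionManager is not a Core Motion API; it's a small singleton wrapper around CMMotionManager (Apple recommends creating only one CMMotionManager per app), roughly like this sketch:

#import <CoreMotion/CoreMotion.h>

// Sketch of the assumed shared-instance accessor; adjust to your own project.
@interface CMMotionManager (Shared)
+ (CMMotionManager *)SharedMotionManager;
@end

@implementation CMMotionManager (Shared)

+ (CMMotionManager *)SharedMotionManager
{
    static CMMotionManager *sharedInstance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[CMMotionManager alloc] init];
    });
    return sharedInstance;
}

@end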

Here's my code...

@implementation MainViewController {
    double previousLowPassFilteredAccelerationX;
    double previousLowPassFilteredAccelerationY;
    double previousLowPassFilteredAccelerationZ;

    CGPoint position;
    int numOfTrainedGestures;
    GeometricRecognizer recognizer; // the C++ $1 recognizer, so this file is Objective-C++ (.mm)
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    previousLowPassFilteredAccelerationX = previousLowPassFilteredAccelerationY = previousLowPassFilteredAccelerationZ = 0.0;

    recognizer = GeometricRecognizer();

    // Note: I let the user train his own gestures, so I start up each time with 0 gestures
    numOfTrainedGestures = 0;
}

#define kLowPassFilteringFactor 0.1
#define MOVEMENT_HZ 50
#define NOISE_REDUCTION 0.05

- (IBAction)StartAccelerometer
{
    CMMotionManager *motionManager = [CMMotionManager SharedMotionManager]; // custom shared-instance wrapper, not a Core Motion API
    if ([motionManager isDeviceMotionAvailable])
    {
        [motionManager setDeviceMotionUpdateInterval:1.0/MOVEMENT_HZ];
        [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                       withHandler: ^(CMDeviceMotion *motion, NSError *error)
         {
             CMAcceleration lowpassFilterAcceleration, userAcceleration = motion.userAcceleration;

             lowpassFilterAcceleration.x = (userAcceleration.x * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationX * (1.0 - kLowPassFilteringFactor));
             lowpassFilterAcceleration.y = (userAcceleration.y * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationY * (1.0 - kLowPassFilteringFactor));
             lowpassFilterAcceleration.z = (userAcceleration.z * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationZ * (1.0 - kLowPassFilteringFactor));

             if (lowpassFilterAcceleration.x > NOISE_REDUCTION || lowpassFilterAcceleration.y > NOISE_REDUCTION)
                 [self.points addObject:[NSString stringWithFormat:@"%.2f,%.2f", lowpassFilterAcceleration.x, lowpassFilterAcceleration.y]];

             previousLowPassFilteredAccelerationX = lowpassFilterAcceleration.x;
             previousLowPassFilteredAccelerationY = lowpassFilterAcceleration.y;
             previousLowPassFilteredAccelerationZ = lowpassFilterAcceleration.z;


             // Just viewing the points to the user
             self.XLabel.text = [NSString stringWithFormat:@"X : %.2f", lowpassFilterAcceleration.x];
             self.YLabel.text = [NSString stringWithFormat:@"Y : %.2f", lowpassFilterAcceleration.y];
             self.ZLabel.text = [NSString stringWithFormat:@"Z : %.2f", lowpassFilterAcceleration.z];
         }];
    }
    else NSLog(@"DeviceMotion is not available");
}


- (IBAction)StopAccelerometer
{
    [[CMMotionManager SharedMotionManager] stopDeviceMotionUpdates];

    // View all the points to the user
    self.pointsTextView.text = [NSString stringWithFormat:@"%lu\n\n%@", (unsigned long)self.points.count, [self.points componentsJoinedByString:@"\n"]];

    // There must be more than one trained gesture because, when recognizing, the recognizer just returns the closest one in distance
    if (numOfTrainedGestures > 1) {
        Path2D path = [self createPathFromPoints]; // A method to create a 2D path from pointsArray
        if (path.size()) {
            RecognitionResult recognitionResult = recognizer.recognize(path);
            self.recognitionLabel.text = [NSString stringWithFormat:@"%s Detected with Prob %.2f!", recognitionResult.name.c_str(),
                                          recognitionResult.score];
        } else self.recognitionLabel.text = @"Not enough points for gesture !";
    }
    else self.recognitionLabel.text = @"Not enough templates !";

    [self releaseAllVariables];
}
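
The createPathFromPoints helper and the training side aren't shown above, so here's a rough sketch of how they could look. Point2D and addTemplate are taken from the C++ $1 recognizer package, and TrainGesture is just a hypothetical action name; double-check the exact names against the headers in the download you use.

// Sketch only: Point2D, Path2D and addTemplate are assumed to match the
// C++ $1 recognizer headers; verify against the actual download.
- (Path2D)createPathFromPoints
{
    Path2D path;
    for (NSString *pointString in self.points) {
        // Each saved point is an "x,y" string (see StartAccelerometer above)
        NSArray *components = [pointString componentsSeparatedByString:@","];
        path.push_back(Point2D([components[0] doubleValue], [components[1] doubleValue]));
    }
    return path;
}

// The training counterpart of the recognize call in StopAccelerometer
- (IBAction)TrainGesture
{
    Path2D path = [self createPathFromPoints];
    if (path.size()) {
        // "Circle" is only an example name for the gesture being trained
        recognizer.addTemplate("Circle", path);
        numOfTrainedGestures++;
    }
    [self releaseAllVariables];
}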