
I am creating a 3D compass application.

I am using the getOrientation method to obtain the orientation (almost the same implementation as here). It works well if I hold the phone flat on a table, but when the top of the phone points toward the sky (the minus Z axis in the picture; the sphere is the Earth), getOrientation starts giving very bad results: it reports values between 0 and 180 degrees around the Z axis for only a few real degrees of movement. Is there any way to suppress this behaviour? I made a short video demonstrating the problem (sorry about the poor quality). Thanks in advance.

[figure: device axes; the sphere is the Earth]

Solution: when rotating the model, the difference is between:

gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH

and:

gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH

4 Answers


Well, I can see at least one problem with this approach of yours.

I assume that you combine the 3D vector corresponding to your magnetometer with an averaging low-pass filter to smooth the data. Although this approach works great for sensor values that vary without discontinuities, such as raw data from the accelerometer, it does not work so well for the angular values obtained from the magnetometer. Why, one might ask?

Because these angular variables (azimuth, pitch, roll) have an upper and a lower bound, which means that any value above 180 degrees, say 181 degrees, wraps around to 181 - 360 = -179 degrees, and any value below -180 degrees wraps around in the other direction. So when one of the angular variables approaches one of these thresholds (180 or -180), it will tend to oscillate between values close to both extremes. When you blindly apply a low-pass filter to these values, you get either a smooth decrease from 180 down to -180 degrees, or a smooth increase from -180 up to 180 degrees. Either way, the result looks a lot like your video above. As long as you apply an averaging buffer directly to the raw angle data from getOrientation(...), this problem will be there (and it should appear not only when the phone is held upright, but also whenever the azimuth wraps around; perhaps you could test for those errors too).
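To make the wrap-around failure concrete, here is a small standalone sketch (not part of the OP's app; names are illustrative): naively averaging two headings that straddle the ±180° boundary yields a direction that is off by 180°, whereas averaging the wrapped difference does not.

```java
public class WrapAroundDemo {
    // naive arithmetic mean of two angles in degrees
    static float naiveAverage(float a, float b) {
        return (a + b) / 2f;
    }

    // wrap-aware mean: wrap the *difference* into [-180, 180), average that,
    // then re-wrap the result into [-180, 180)
    static float wrapAwareAverage(float a, float b) {
        float diff = b - a;
        if (diff >= 180f) diff -= 360f;
        if (diff < -180f) diff += 360f;
        float avg = a + diff / 2f;
        while (avg >= 180f) avg -= 360f;
        while (avg < -180f) avg += 360f;
        return avg;
    }

    public static void main(String[] args) {
        // two samples of an almost-southward heading:
        System.out.println(naiveAverage(179f, -179f));     // 0.0 -> points north, wrong
        System.out.println(wrapAwareAverage(179f, -179f)); // -180.0 -> south, correct
    }
}
```

The naive mean lands at 0° (due north) even though both samples point almost due south; any averaging buffer fed raw angles will show exactly this artifact near the bounds.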

You say you tested this with a buffer size of 1. In theory, the problem should not be present if there is no averaging at all, although in some implementations of circular buffers that I have seen in the past, this could mean that there is still averaging with at least one past value, rather than no averaging at all. If that is your case, we have found the root cause of your bug.

Unfortunately, there aren't many elegant solutions that can be implemented while sticking with the standard averaging filter. What I usually do in this case is switch to another type of low-pass filter, one that doesn't need any deep buffer to operate: a simple first-order IIR filter:

diff = x[n] - y[n-1]

y[n] - y[n-1] = alpha * (x[n] - y[n-1]) = alpha * diff

...where y is the filtered angle, x is the raw angle, and alpha < 1 acts like a time constant: alpha = 1 corresponds to the no-filter case, and the frequency cutoff of the low-pass filter gets lower as alpha approaches zero. A keen eye may have noticed by now that this corresponds to a simple Proportional Controller.

Such a filter allows compensating for the wrap-around of the angle value, because we can add or subtract 360 to diff to ensure that abs(diff) <= 180, which in turn ensures that the filtered angle value will always increase/decrease in the optimal direction to reach its "setpoint".

An example function, scheduled to be called periodically, that calculates the filtered angle value y for a given raw angle value x, could be something like this:

private float restrictAngle(float tmpAngle){
    while(tmpAngle>=180) tmpAngle-=360;
    while(tmpAngle<-180) tmpAngle+=360;
    return tmpAngle;
}

//x is a raw angle value from getOrientation(...)
//y is the current filtered angle value
private float calculateFilteredAngle(float x, float y){ 
    final float alpha = 0.1f;
    float diff = x-y;

    //here, we ensure that abs(diff)<=180
    diff = restrictAngle(diff);

    y += alpha*diff;
    //ensure that y stays within [-180, 180[ bounds
    y = restrictAngle(y);

    return y;
}

This function calculateFilteredAngle(float x, float y) can then be called periodically, for example with the azimuth angle from getOrientation(...), using something like this:

filteredAzimuth = calculateFilteredAngle(azimuth, filteredAzimuth);

Using this approach, the filter does not misbehave like the averaging filter the OP mentioned.
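A quick standalone check of that claim (an assumed test harness, not from the answer itself): starting the filter at 170° and feeding it a constant raw reading of -170°, the filtered value should move *up* through 180° and wrap around, instead of swinging the long way down through 0°.

```java
public class FilterDemo {
    // same restriction helper as in the answer: wrap into [-180, 180)
    static float restrictAngle(float tmpAngle) {
        while (tmpAngle >= 180f) tmpAngle -= 360f;
        while (tmpAngle < -180f) tmpAngle += 360f;
        return tmpAngle;
    }

    // the answer's IIR filter step: y follows x along the shortest arc
    static float calculateFilteredAngle(float x, float y) {
        final float alpha = 0.1f;
        float diff = restrictAngle(x - y);
        return restrictAngle(y + alpha * diff);
    }

    public static void main(String[] args) {
        float y = 170f;
        // first step: diff wraps to +20, so y moves up to 172, not down toward 0
        y = calculateFilteredAngle(-170f, y);
        System.out.println(y);
        // after enough iterations y settles near -170, having wrapped at 180
        for (int i = 0; i < 200; i++) {
            y = calculateFilteredAngle(-170f, y);
        }
        System.out.println(y);
    }
}
```

The first update takes y from 170° to 172°, confirming the filter crosses the boundary along the short 20° arc rather than the long 340° one.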

As I could not load the .apk the OP uploaded, I decided to implement my own test project in order to see whether the corrections work. Here is the entire code (it does not use an XML main layout, so I did not include one). Simply copy it into a test project to see if it works on a specific device (tested and functional on an HTC Desire with Android v2.1):

File 1: Compass3DActivity.java:

package com.epichorns.compass3D;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.ViewGroup;
import android.widget.LinearLayout;
import android.widget.TextView;

public class Compass3DActivity extends Activity {
    //Textviews for showing angle data
    TextView mTextView_azimuth;
    TextView mTextView_pitch;
    TextView mTextView_roll;

    TextView mTextView_filtered_azimuth;
    TextView mTextView_filtered_pitch;
    TextView mTextView_filtered_roll;


    float mAngle0_azimuth=0;
    float mAngle1_pitch=0;
    float mAngle2_roll=0;

    float mAngle0_filtered_azimuth=0;
    float mAngle1_filtered_pitch=0;
    float mAngle2_filtered_roll=0;

    private Compass3DView mCompassView;

    private SensorManager sensorManager;
    //sensor calculation values
    float[] mGravity = null;
    float[] mGeomagnetic = null;
    float Rmat[] = new float[9];
    float Imat[] = new float[9];
    float orientation[] = new float[3];
    SensorEventListener mAccelerometerListener = new SensorEventListener(){
        public void onAccuracyChanged(Sensor sensor, int accuracy) {}

        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER){
                mGravity = event.values.clone();
                processSensorData();
            }
        }   
    };
    SensorEventListener mMagnetometerListener = new SensorEventListener(){
        public void onAccuracyChanged(Sensor sensor, int accuracy) {}

        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD){
                mGeomagnetic = event.values.clone();
                processSensorData();                
                update();
            }
        }   
    };

    private float restrictAngle(float tmpAngle){
        while(tmpAngle>=180) tmpAngle-=360;
        while(tmpAngle<-180) tmpAngle+=360;
        return tmpAngle;
    }

    //x is a raw angle value from getOrientation(...)
    //y is the current filtered angle value
    private float calculateFilteredAngle(float x, float y){ 
        final float alpha = 0.3f;
        float diff = x-y;

        //here, we ensure that abs(diff)<=180
        diff = restrictAngle(diff);

        y += alpha*diff;
        //ensure that y stays within [-180, 180[ bounds
        y = restrictAngle(y);

        return y;
    }



    public void processSensorData(){
        if (mGravity != null && mGeomagnetic != null) { 
            boolean success = SensorManager.getRotationMatrix(Rmat, Imat, mGravity, mGeomagnetic);
            if (success) {              
                SensorManager.getOrientation(Rmat, orientation);
                mAngle0_azimuth = (float)Math.toDegrees((double)orientation[0]); // orientation contains: azimut, pitch and roll
                mAngle1_pitch = (float)Math.toDegrees((double)orientation[1]); //pitch
                mAngle2_roll = -(float)Math.toDegrees((double)orientation[2]); //roll               
                mAngle0_filtered_azimuth = calculateFilteredAngle(mAngle0_azimuth, mAngle0_filtered_azimuth);
                mAngle1_filtered_pitch = calculateFilteredAngle(mAngle1_pitch, mAngle1_filtered_pitch);
                mAngle2_filtered_roll = calculateFilteredAngle(mAngle2_roll, mAngle2_filtered_roll);    
            }           
            mGravity=null; //oblige full new refresh
            mGeomagnetic=null; //oblige full new refresh
        }
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);     
        LinearLayout ll = new LinearLayout(this);       
        LinearLayout.LayoutParams llParams = new LinearLayout.LayoutParams(LinearLayout.LayoutParams.FILL_PARENT, LinearLayout.LayoutParams.FILL_PARENT);      
        ll.setLayoutParams(llParams);      
        ll.setOrientation(LinearLayout.VERTICAL);      
        ViewGroup.LayoutParams txtParams = new ViewGroup.LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);        
        mTextView_azimuth = new TextView(this);
        mTextView_azimuth.setLayoutParams(txtParams);
        mTextView_pitch = new TextView(this);
        mTextView_pitch.setLayoutParams(txtParams);
        mTextView_roll = new TextView(this);
        mTextView_roll.setLayoutParams(txtParams);      
        mTextView_filtered_azimuth = new TextView(this);
        mTextView_filtered_azimuth.setLayoutParams(txtParams);
        mTextView_filtered_pitch = new TextView(this);
        mTextView_filtered_pitch.setLayoutParams(txtParams);
        mTextView_filtered_roll = new TextView(this);
        mTextView_filtered_roll.setLayoutParams(txtParams);

        mCompassView = new Compass3DView(this);        
        ViewGroup.LayoutParams compassParams = new ViewGroup.LayoutParams(200,200);
        mCompassView.setLayoutParams(compassParams);

        ll.addView(mCompassView);
        ll.addView(mTextView_azimuth);
        ll.addView(mTextView_pitch);
        ll.addView(mTextView_roll);
        ll.addView(mTextView_filtered_azimuth);
        ll.addView(mTextView_filtered_pitch);
        ll.addView(mTextView_filtered_roll);

        setContentView(ll);

        sensorManager = (SensorManager) this.getSystemService(Context.SENSOR_SERVICE);
        sensorManager.registerListener(mAccelerometerListener, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_UI); 
        sensorManager.registerListener(mMagnetometerListener, sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_UI);
        update();       
    }


    @Override
    public void onDestroy(){
        super.onDestroy();
        sensorManager.unregisterListener(mAccelerometerListener);
        sensorManager.unregisterListener(mMagnetometerListener);
    }


    private void update(){
        mCompassView.changeAngles(mAngle1_filtered_pitch,  mAngle2_filtered_roll, mAngle0_filtered_azimuth);

        mTextView_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_azimuth));
        mTextView_pitch.setText("Pitch: "+String.valueOf(mAngle1_pitch));
        mTextView_roll.setText("Roll: "+String.valueOf(mAngle2_roll));

        mTextView_filtered_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_filtered_azimuth));
        mTextView_filtered_pitch.setText("Pitch: "+String.valueOf(mAngle1_filtered_pitch));
        mTextView_filtered_roll.setText("Roll: "+String.valueOf(mAngle2_filtered_roll));

    }
}

File 2: Compass3DView.java:

package com.epichorns.compass3D;

import android.content.Context;
import android.opengl.GLSurfaceView;

public class Compass3DView extends GLSurfaceView {
    private Compass3DRenderer mRenderer;

    public Compass3DView(Context context) {
        super(context);
        mRenderer = new Compass3DRenderer(context);
        setRenderer(mRenderer);
    }

    public void changeAngles(float angle0, float angle1, float angle2){
        mRenderer.setAngleX(angle0);
        mRenderer.setAngleY(angle1);
        mRenderer.setAngleZ(angle2);
    }

}

File 3: Compass3DRenderer.java:

package com.epichorns.compass3D;


import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.content.Context;
import android.opengl.GLSurfaceView;


public class Compass3DRenderer implements GLSurfaceView.Renderer {
    Context mContext;

    // a raw buffer to hold indices
    ShortBuffer _indexBuffer;    
    // raw buffers to hold the vertices
    FloatBuffer _vertexBuffer0;
    FloatBuffer _vertexBuffer1;
    FloatBuffer _vertexBuffer2;
    FloatBuffer _vertexBuffer3;
    FloatBuffer _vertexBuffer4;
    FloatBuffer _vertexBuffer5;
    int _numVertices = 3; //standard triangle vertices = 3

    FloatBuffer _textureBuffer0123;



    //private FloatBuffer _light0Position;
    //private FloatBuffer _light0Ambient;
    float _light0Position[] = new float[]{10.0f, 10.0f, 10.0f, 0.0f};
    float _light0Ambient[] = new float[]{0.05f, 0.05f, 0.05f, 1.0f};
    float _light0Diffuse[] = new float[]{0.5f, 0.5f, 0.5f, 1.0f};
    float _light0Specular[] = new float[]{0.7f, 0.7f, 0.7f, 1.0f};
    float _matAmbient[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };
    float _matDiffuse[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };




    private float _angleX=0f;
    private float _angleY=0f;
    private float _angleZ=0f;


    Compass3DRenderer(Context context){
        super();
        mContext = context;
    }

    public void setAngleX(float angle) {
        _angleX = angle;
    }

    public void setAngleY(float angle) {
        _angleY = angle;
    }

    public void setAngleZ(float angle) {
        _angleZ = angle;
    }

    FloatBuffer InitFloatBuffer(float[] src){
        ByteBuffer bb = ByteBuffer.allocateDirect(4*src.length);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer inBuf = bb.asFloatBuffer();
        inBuf.put(src);
        return inBuf;
    }

    ShortBuffer InitShortBuffer(short[] src){
        ByteBuffer bb = ByteBuffer.allocateDirect(2*src.length);
        bb.order(ByteOrder.nativeOrder());
        ShortBuffer inBuf = bb.asShortBuffer();
        inBuf.put(src);
        return inBuf;
    }

    //Init data for our rendered pyramid
    private void initTriangles() {

        //Side faces triangles
        float[] coords = {
            -0.25f, -0.5f, 0.25f,
            0.25f, -0.5f, 0.25f,
            0f, 0.5f, 0f
        };

        float[] coords1 = {
            0.25f, -0.5f, 0.25f,
            0.25f, -0.5f, -0.25f,
            0f, 0.5f, 0f
        };

        float[] coords2 = {
            0.25f, -0.5f, -0.25f,
            -0.25f, -0.5f, -0.25f,
            0f, 0.5f, 0f
        };

        float[] coords3 = {
            -0.25f, -0.5f, -0.25f,
            -0.25f, -0.5f, 0.25f,
            0f, 0.5f, 0f
        };

        //Base triangles
        float[] coords4 = {
            -0.25f, -0.5f, 0.25f,
            0.25f, -0.5f, -0.25f,
            0.25f, -0.5f, 0.25f
        };

        float[] coords5 = {
            -0.25f, -0.5f, 0.25f,
            -0.25f, -0.5f, -0.25f, 
            0.25f, -0.5f, -0.25f
        };


        float[] textures0123 = {
                // Mapping coordinates for the vertices (UV mapping CW)
                0.0f, 0.0f,     // bottom left                    
                1.0f, 0.0f,     // bottom right
                0.5f, 1.0f,     // top ctr              
        };


        _vertexBuffer0 = InitFloatBuffer(coords);
        _vertexBuffer0.position(0);

        _vertexBuffer1 = InitFloatBuffer(coords1);
        _vertexBuffer1.position(0);    

        _vertexBuffer2 = InitFloatBuffer(coords2);
        _vertexBuffer2.position(0);

        _vertexBuffer3 = InitFloatBuffer(coords3);
        _vertexBuffer3.position(0);

        _vertexBuffer4 = InitFloatBuffer(coords4);
        _vertexBuffer4.position(0);

        _vertexBuffer5 = InitFloatBuffer(coords5);
        _vertexBuffer5.position(0);

        _textureBuffer0123 = InitFloatBuffer(textures0123);
        _textureBuffer0123.position(0);

        short[] indices = {0, 1, 2};
        _indexBuffer = InitShortBuffer(indices);        
        _indexBuffer.position(0);

    }


    public void onSurfaceCreated(GL10 gl, EGLConfig config) {

        gl.glEnable(GL10.GL_CULL_FACE); // enable the differentiation of which side may be visible 
        gl.glShadeModel(GL10.GL_SMOOTH);

        gl.glFrontFace(GL10.GL_CCW); // which is the front? the one which is drawn counter clockwise
        gl.glCullFace(GL10.GL_BACK); // which one should NOT be drawn

        initTriangles();

        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }

    public void onDrawFrame(GL10 gl) {


        gl.glPushMatrix();

        gl.glClearColor(0, 0, 0, 1.0f); //clipping backdrop color
        // clear the color buffer to show the ClearColor we called above...
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);

        // set rotation       
        gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
        gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
        gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH

        //Draw our pyramid

        //4 side faces
        gl.glColor4f(0.5f, 0f, 0f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer0);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glColor4f(0.5f, 0.5f, 0f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer1);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glColor4f(0f, 0.5f, 0f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer2);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glColor4f(0f, 0.5f, 0.5f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer3);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        //Base face
        gl.glColor4f(0f, 0f, 0.5f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer4);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer5);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glPopMatrix();
    }

    public void onSurfaceChanged(GL10 gl, int w, int h) {
        gl.glViewport(0, 0, w, h);
    }



}

Note that this code does not compensate for the default landscape orientation of tablets, so it is only expected to work correctly on a phone (I did not have a tablet nearby to test any correction code).

Answered 2012-04-29T05:36:58.397

You should probably try a longer delay, such as Game, and/or keep or increase the size of your circular buffer. The sensors (accelerometer, compass, etc.) on mobile devices are inherently noisy, so when I asked about a "low-pass filter", I meant: are you using more data to reduce the frequency of updates available to the app? Your video was made indoors; I would also suggest going somewhere with less electromagnetic interference, such as a park, to check whether the behaviour is consistent, and performing the standard compass reset action (rotating the device in a figure of 8). In the end, you may have to apply some heuristics to throw out the "bad" data to provide a smoother experience to the user.
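The circular buffer this answer alludes to could look like the following sketch (entirely illustrative names; the answer does not prescribe an implementation). It averages raw 3-component sensor vectors (accelerometer or magnetometer readings) component-wise before they are converted into angles, which sidesteps the angle wrap-around issue discussed above.

```java
public class VectorRingBuffer {
    private final float[][] samples; // fixed-size ring of 3-component vectors
    private int next = 0;            // slot to overwrite next
    private int count = 0;           // number of valid samples so far

    public VectorRingBuffer(int size) {
        samples = new float[size][3];
    }

    // store a copy of the latest raw sensor vector, overwriting the oldest
    public void add(float[] v) {
        samples[next][0] = v[0];
        samples[next][1] = v[1];
        samples[next][2] = v[2];
        next = (next + 1) % samples.length;
        if (count < samples.length) count++;
    }

    // component-wise mean of the buffered vectors
    public float[] average() {
        float[] avg = new float[3];
        for (int i = 0; i < count; i++) {
            avg[0] += samples[i][0];
            avg[1] += samples[i][1];
            avg[2] += samples[i][2];
        }
        if (count > 0) {
            avg[0] /= count;
            avg[1] /= count;
            avg[2] /= count;
        }
        return avg;
    }

    public static void main(String[] args) {
        VectorRingBuffer buf = new VectorRingBuffer(4);
        buf.add(new float[]{1f, 0f, 0f});
        buf.add(new float[]{3f, 0f, 0f});
        float[] a = buf.average();
        System.out.println(a[0] + " " + a[1] + " " + a[2]); // 2.0 0.0 0.0
    }
}
```

The averaged vectors would then be passed to SensorManager.getRotationMatrix(...) in place of the raw event values; increasing the buffer size trades responsiveness for smoothness.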

Answered 2012-04-17T17:28:28.253

Well, I have faced exactly the same problem when retrieving the orientation. The thing is that I did not get it solved (I had to impose a constraint on the device's position when retrieving it), and I don't know whether you will be able to either.

Pick up a magnetic compass and try to get the direction of north while the compass is in the situation you describe: you will get the same nonsensical results. So you cannot expect the device's compass to do any better!

Answered 2012-04-24T15:19:49.697

A few words about filtering, if I may.

  1. I would suggest averaging the magnetic field vector itself before converting it into angles.
  2. Averaging/smoothing only the angles, without some kind of magnitude, is wrong. An angle by itself does not provide enough data to determine a direction/heading/bearing. Example: when you want to know the average wind direction over a whole day, you must use the strength of the wind, not just its angle. If you average only the angles, you will get an absolutely wrong wind direction. As for bearing direction, I would use speed as the magnitude.
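The wind example in point 2 can be sketched as follows (an illustrative demo, not code from the answer): averaging the *vectors*, with direction weighted by magnitude, versus naively averaging the angles alone.

```java
public class WindAverageDemo {
    // naive mean of two angles in degrees, ignoring magnitude
    static double naiveAngleMean(double a, double b) {
        return (a + b) / 2.0;
    }

    // vector mean: convert (magnitude, angle) pairs to components,
    // sum them, and take the angle of the resulting vector
    static double vectorMeanAngle(double m1, double a1, double m2, double a2) {
        double x = m1 * Math.cos(Math.toRadians(a1)) + m2 * Math.cos(Math.toRadians(a2));
        double y = m1 * Math.sin(Math.toRadians(a1)) + m2 * Math.sin(Math.toRadians(a2));
        return Math.toDegrees(Math.atan2(y, x));
    }

    public static void main(String[] args) {
        // a strong wind from 10 degrees and a weak wind from 170 degrees:
        System.out.println(naiveAngleMean(10, 170));            // 90.0 -> ignores strength entirely
        System.out.println(vectorMeanAngle(9.0, 10, 1.0, 170)); // stays close to 10 degrees
    }
}
```

The naive mean reports 90° even though the wind blew from roughly 10° for almost all of its strength; the vector mean stays near the dominant direction, which is why the field vector should be averaged before converting to angles.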
Answered 2014-01-13T15:50:53.587