
So I'm having trouble using Microsoft's Emotion API for Android. I have no problem running the Face API; I'm able to get the face rectangles, but I can't get the Emotion API to work. I'm capturing the image with the built-in Android camera itself. Here is the code I'm using:

private void detectAndFrame(final Bitmap imageBitmap)
{
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    imageBitmap.compress(Bitmap.CompressFormat.PNG, 100, outputStream);
    ByteArrayInputStream inputStream =
            new ByteArrayInputStream(outputStream.toByteArray());
    AsyncTask<InputStream, String, List<RecognizeResult>> detectTask =
            new AsyncTask<InputStream, String, List<RecognizeResult>>() {
                @Override
                protected List<RecognizeResult> doInBackground(InputStream... params) {
                    try {
                        Log.e("i","Detecting...");
                        faces = faceServiceClient.detect(
                                params[0],
                                true,         // returnFaceId
                                false,        // returnFaceLandmarks
                                null           // returnFaceAttributes: a string like "age, gender"
                        );
                        if (faces == null)
                        {
                            Log.e("i","Detection Finished. Nothing detected");
                            return null;
                        }
                        Log.e("i",
                                String.format("Detection Finished. %d face(s) detected",
                                        faces.length));
                        ImageView imageView = (ImageView)findViewById(R.id.imageView);
                        InputStream stream = params[0];
                        com.microsoft.projectoxford.emotion.contract.FaceRectangle[] rects = new com.microsoft.projectoxford.emotion.contract.FaceRectangle[faces.length];
                        for (int i = 0; i < faces.length; i++) {
                            com.microsoft.projectoxford.face.contract.FaceRectangle rect = faces[i].faceRectangle;
                            rects[i] = new com.microsoft.projectoxford.emotion.contract.FaceRectangle(rect.left, rect.top, rect.width, rect.height);
                        }
                        List<RecognizeResult> result;
                        result =  client.recognizeImage(stream, rects);
                        return result;
                    } catch (Exception e) {
                        Log.e("e", e.getMessage());
                        Log.e("e", "Detection failed");
                        return null;
                    }
                }
                @Override
                protected void onPreExecute() {
                    //TODO: show progress dialog
                }
                @Override
                protected void onProgressUpdate(String... progress) {
                    //TODO: update progress
                }
                @Override
                protected void onPostExecute(List<RecognizeResult> result) {
                    ImageView imageView = (ImageView)findViewById(R.id.imageView);
                    imageView.setImageBitmap(drawFaceRectanglesOnBitmap(imageBitmap, faces));
                    MediaStore.Images.Media.insertImage(getContentResolver(), imageBitmap, "AnImage" ,"Another image");
                    if (result == null) return;
                    for (RecognizeResult res: result) {
                        Scores scores = res.scores;
                        Log.e("Anger: ", ((Double)scores.anger).toString());
                        Log.e("Neutral: ", ((Double)scores.neutral).toString());
                        Log.e("Happy: ", ((Double)scores.happiness).toString());
                    }

                }
            };
    detectTask.execute(inputStream);
}

I keep getting a Post Request 400 error, which suggests some problem with the JSON or the face rectangles. But I'm not sure where to start debugging this.


1 Answer


You use the stream twice, so the second time you are already at the end of the stream. You can either reset the stream, or simply call the Emotion API without rectangles (i.e., skip the Face API call altogether). The Emotion API will determine the face rectangles for you.
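A minimal sketch of the two fixes, in the context of the `doInBackground` above (it reuses the question's `faceServiceClient`, `client`, and `faces`; the rectangle-free `recognizeImage` overload of the Emotion SDK is assumed):

// Option 1: rewind the stream between the two calls.
// ByteArrayInputStream supports reset() without a prior mark();
// it rewinds to the position the stream was created at (the start here).
InputStream stream = params[0];
faces = faceServiceClient.detect(stream, true, false, null);
// ... build rects[] from faces, exactly as in the question ...
stream.reset();                 // back to the start before the second read
List<RecognizeResult> result = client.recognizeImage(stream, rects);

// Option 2 (an alternative, not in addition to Option 1): skip the Face API
// call and let the Emotion API locate the faces itself.
List<RecognizeResult> resultWithoutRects = client.recognizeImage(params[0]);

Either way, you avoid reading from an exhausted stream, which is what was producing the 400 response.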

answered 2016-06-21T08:31:32.247