
I'm trying to allow users to upload a video in a react-native application, but am having trouble even getting access to the camera.

**Not using Expo. I used the react-native CLI and react-native-init to generate my project.**

**Using react-native version 0.53.0. Android version 5.1.**

**UPDATE: After some good advice, I've changed compileSdkVersion and targetSdkVersion in my android/app/build.gradle file to > 23 and know that the problem isn't with my permissions. Also, the deprecated RCTCamera version of react-native-camera works fine: I'm able to capture still images and video and save them to the device. Only the master version, which uses RNCamera, still crashes the app every time the screen loads :/

This is less than ideal because I don't want deprecated code in my app. So when I refer to react-native-camera below, I mean the current version that uses RNCamera, not RCTCamera.**

I tried using the react-native-camera package (https://github.com/react-native-community/react-native-camera), but it crashes my app every time. Besides, I'd rather use a camera app the user has already installed than build my own camera view, which is what react-native-camera requires.

Looking around, I have stumbled across three promising ways to solve this:

1) Linking - As far as I understand from this StackOverflow post (React native send a message to specific whatsapp Number), linking can be used to open other apps the user has on their device. I figure this could be used to open camera apps as well, but I haven't found any info on it. How do I check that the user has a camera app, and then link to it? Ideally a pop-up menu would appear on the user's phone asking them to choose from a list of available camera apps.

2) This post from the Android developer docs - https://developer.android.com/training/camera/videobasics.html - describes how to do exactly what I want, but I'm having trouble turning it into a native module I can use from my components. I have only basic knowledge of building bridges in react-native and was only able to get a simple native Toast module working after reading a couple of articles that laid out all the code. Could anyone write a VideoModule.java file that implements the functionality the Android docs describe (see my rough attempt at a sketch after this list)? This seems like the easiest solution to me, but my lack of Java/Android knowledge is standing in the way.

3) ReactNativeWebRTC - I have already included this module (https://github.com/oney/react-native-webrtc) successfully on a different screen of the application. But since I'm using it to stream video between two peers, I don't see how to use it to upload video. I looked for something akin to the MediaRecorder API that I use in the web version of the app, but haven't had any luck. I do know that if I can get the binary data from the media stream, I can send it directly to my server. So, is there a way to store the media stream from the getUserMedia() method that react-native-webrtc provides in a buffer, without a MediaRecorder like on the web?

Any solution would be tremendously helpful here. Since I currently only have an Android phone to test on, I don't need info on how to make this work on iOS; just a solution for Android. Thank you very much.
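
For option 2, here's roughly the shape I think that native module would take. This is only an untested sketch based on the Android docs and the Toast example I got working - the class name, package, and request code (42) are placeholders, and it still needs to be registered in a ReactPackage:

// VideoModule.java - untested sketch: launch the system camera app with
// ACTION_VIDEO_CAPTURE and resolve the resulting video URI back to JS.
package com.slimnative;

import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;

import com.facebook.react.bridge.ActivityEventListener;
import com.facebook.react.bridge.BaseActivityEventListener;
import com.facebook.react.bridge.Promise;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.ReactMethod;

public class VideoModule extends ReactContextBaseJavaModule {

    private static final int RECORD_VIDEO_REQUEST = 42; // arbitrary request code
    private Promise recordPromise;

    // Receives the camera app's result and settles the pending promise
    private final ActivityEventListener activityListener = new BaseActivityEventListener() {
        @Override
        public void onActivityResult(Activity activity, int requestCode, int resultCode, Intent data) {
            if (requestCode != RECORD_VIDEO_REQUEST || recordPromise == null) {
                return;
            }
            if (resultCode == Activity.RESULT_OK && data != null && data.getData() != null) {
                // content:// URI of the recorded video
                recordPromise.resolve(data.getData().toString());
            } else {
                recordPromise.reject("E_CANCELLED", "Video recording was cancelled");
            }
            recordPromise = null;
        }
    };

    public VideoModule(ReactApplicationContext reactContext) {
        super(reactContext);
        reactContext.addActivityEventListener(activityListener);
    }

    @Override
    public String getName() {
        return "VideoModule"; // available as NativeModules.VideoModule in JS
    }

    @ReactMethod
    public void recordVideo(Promise promise) {
        Activity currentActivity = getCurrentActivity();
        if (currentActivity == null) {
            promise.reject("E_NO_ACTIVITY", "No activity attached");
            return;
        }
        Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
        // Make sure at least one camera app can handle the intent
        if (intent.resolveActivity(currentActivity.getPackageManager()) == null) {
            promise.reject("E_NO_CAMERA_APP", "No camera app available");
            return;
        }
        recordPromise = promise;
        // The system opens the user's camera app (or a chooser if several are installed)
        currentActivity.startActivityForResult(intent, RECORD_VIDEO_REQUEST);
    }
}

On the JS side I imagine I'd then call something like NativeModules.VideoModule.recordVideo().then(uri => ...), but I'm not sure about the details, so corrections are welcome.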

Here are my AndroidManifest.xml permissions:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus"/>

<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.RECORD_VIDEO" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
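
Since my targetSdkVersion is now above 23, I'm aware that on Android 6.0+ the CAMERA and RECORD_AUDIO permissions also have to be requested at runtime; the manifest entries alone aren't enough there. My test phone is on 5.1, so this shouldn't be the cause of the crash, but for completeness this is roughly the check I'd use with React Native's built-in PermissionsAndroid API (just a sketch):

import { PermissionsAndroid } from 'react-native';

// Sketch: request the "dangerous" camera/audio permissions at runtime
// before opening any camera UI. Only needed on Android 6.0+ (API 23+).
async function requestCameraPermissions() {
  const results = await PermissionsAndroid.requestMultiple([
    PermissionsAndroid.PERMISSIONS.CAMERA,
    PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
  ]);
  return (
    results[PermissionsAndroid.PERMISSIONS.CAMERA] === PermissionsAndroid.RESULTS.GRANTED &&
    results[PermissionsAndroid.PERMISSIONS.RECORD_AUDIO] === PermissionsAndroid.RESULTS.GRANTED
  );
}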

My android/app/build.gradle file:

apply plugin: "com.android.application"

import com.android.build.OutputFile

/**
 * The react.gradle file registers a task for each build variant (e.g. bundleDebugJsAndAssets
 * and bundleReleaseJsAndAssets).
 * These basically call `react-native bundle` with the correct arguments during the Android build
 * cycle. By default, bundleDebugJsAndAssets is skipped, as in debug/dev mode we prefer to load the
 * bundle directly from the development server. Below you can see all the possible configurations
 * and their defaults. If you decide to add a configuration block, make sure to add it before the
 * `apply from: "../../node_modules/react-native/react.gradle"` line.
 *
 * project.ext.react = [
 *   // the name of the generated asset file containing your JS bundle
 *   bundleAssetName: "index.android.bundle",
 *
 *   // the entry file for bundle generation
 *   entryFile: "index.android.js",
 *
 *   // whether to bundle JS and assets in debug mode
 *   bundleInDebug: false,
 *
 *   // whether to bundle JS and assets in release mode
 *   bundleInRelease: true,
 *
 *   // whether to bundle JS and assets in another build variant (if configured).
 *   // See http://tools.android.com/tech-docs/new-build-system/user-guide#TOC-Build-Variants
 *   // The configuration property can be in the following formats
 *   //         'bundleIn${productFlavor}${buildType}'
 *   //         'bundleIn${buildType}'
 *   // bundleInFreeDebug: true,
 *   // bundleInPaidRelease: true,
 *   // bundleInBeta: true,
 *
 *   // whether to disable dev mode in custom build variants (by default only disabled in release)
 *   // for example: to disable dev mode in the staging build type (if configured)
 *   devDisabledInStaging: true,
 *   // The configuration property can be in the following formats
 *   //         'devDisabledIn${productFlavor}${buildType}'
 *   //         'devDisabledIn${buildType}'
 *
 *   // the root of your project, i.e. where "package.json" lives
 *   root: "../../",
 *
 *   // where to put the JS bundle asset in debug mode
 *   jsBundleDirDebug: "$buildDir/intermediates/assets/debug",
 *
 *   // where to put the JS bundle asset in release mode
 *   jsBundleDirRelease: "$buildDir/intermediates/assets/release",
 *
 *   // where to put drawable resources / React Native assets, e.g. the ones you use via
 *   // require('./image.png')), in debug mode
 *   resourcesDirDebug: "$buildDir/intermediates/res/merged/debug",
 *
 *   // where to put drawable resources / React Native assets, e.g. the ones you use via
 *   // require('./image.png')), in release mode
 *   resourcesDirRelease: "$buildDir/intermediates/res/merged/release",
 *
 *   // by default the gradle tasks are skipped if none of the JS files or assets change; this means
 *   // that we don't look at files in android/ or ios/ to determine whether the tasks are up to
 *   // date; if you have any other folders that you want to ignore for performance reasons (gradle
 *   // indexes the entire tree), add them here. Alternatively, if you have JS files in android/
 *   // for example, you might want to remove it from here.
 *   inputExcludes: ["android/**", "ios/**"],
 *
 *   // override which node gets called and with what additional arguments
 *   nodeExecutableAndArgs: ["node"],
 *
 *   // supply additional arguments to the packager
 *   extraPackagerArgs: []
 * ]
 */

project.ext.react = [
    entryFile: "index.js"
]

apply from: "../../node_modules/react-native/react.gradle"

/**
 * Set this to true to create two separate APKs instead of one:
 *   - An APK that only works on ARM devices
 *   - An APK that only works on x86 devices
 * The advantage is the size of the APK is reduced by about 4MB.
 * Upload all the APKs to the Play Store and people will download
 * the correct one based on the CPU architecture of their device.
 */
def enableSeparateBuildPerCPUArchitecture = false

/**
 * Run Proguard to shrink the Java bytecode in release builds.
 */
def enableProguardInReleaseBuilds = false

android {
    compileSdkVersion 26
    buildToolsVersion "25.0.2"

    defaultConfig {
        applicationId "com.slimnative"
        minSdkVersion 16
        targetSdkVersion 26
        versionCode 1
        versionName "1.0"
        ndk {
            abiFilters "armeabi-v7a", "x86"
        }
    }
    splits {
        abi {
            reset()
            enable enableSeparateBuildPerCPUArchitecture
            universalApk false  // If true, also generate a universal APK
            include "armeabi-v7a", "x86"
        }
    }
    buildTypes {
        release {
            minifyEnabled enableProguardInReleaseBuilds
            proguardFiles getDefaultProguardFile("proguard-android.txt"), "proguard-rules.pro"
        }
    }
    // applicationVariants are e.g. debug, release
    applicationVariants.all { variant ->
        variant.outputs.each { output ->
            // For each separate APK per architecture, set a unique version code as described here:
            // http://tools.android.com/tech-docs/new-build-system/user-guide/apk-splits
            def versionCodes = ["armeabi-v7a":1, "x86":2]
            def abi = output.getFilter(OutputFile.ABI)
            if (abi != null) {  // null for the universal-debug, universal-release variants
                output.versionCodeOverride =
                        versionCodes.get(abi) * 1048576 + defaultConfig.versionCode
            }
        }
    }
}

dependencies {
    compile fileTree(dir: "libs", include: ["*.jar"])
    compile "com.android.support:appcompat-v7:23.0.1"
    compile "com.facebook.react:react-native:+"  // From node_modules
    compile project(':WebRTCModule')
    compile project(':react-native-svg')
    compile (project(':react-native-camera')) {
        // exclude group: "com.google.android.gms"
        exclude group: "com.android.support"
    }
    // compile ("com.google.android.gms:play-services-vision:10.2.0") {
    //     force = true;
    // }
    compile ('com.android.support:exifinterface:26.0.1') {
        force = true;
    }
}

// Run this once to be able to run the application with BUCK
// puts all compile dependencies into folder libs for BUCK to use
task copyDownloadableDepsToLibs(type: Copy) {
    from configurations.compile
    into 'libs'
}

And my android/build.gradle:

buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:2.2.3'
        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
    }
}

allprojects {
    repositories {
        mavenLocal()
        jcenter()
        maven {
            // All of React Native (JS, Obj-C sources, Android binaries) is installed from npm
            url "$rootDir/../node_modules/react-native/android"
        }
        maven { url "https://jitpack.io" }
        maven {
            url "https://maven.google.com"
        }
    }
}

3 Answers


Hey @mraaron, I just built a react-native app in which I had to record and upload video. Basically you can use one of two approaches:

1) React Native Image Picker (https://github.com/react-community/react-native-image-picker) opens the native camera to record the video and returns the path and other info in the response. The module handles both images and videos; in the options you can specify mediaType: 'photo', 'video', or 'mixed' on iOS, and 'photo' or 'video' on Android.

2) React Native Camera (https://github.com/react-native-community/react-native-camera) lets you customize the camera window, since it does not open the native camera app.

Note: I have implemented both packages and both work fine on Android as well as iOS. If you need any help, you can ping me.
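
For approach 1, a minimal sketch would look something like this (option names are from the react-native-image-picker version I used, so double-check against the version you install; the upload URL is a placeholder):

import ImagePicker from 'react-native-image-picker';

// Sketch: open the device's own camera app in video mode,
// then upload the recorded file to your server.
const options = {
  mediaType: 'video',   // 'photo' or 'video' on Android ('mixed' is iOS only)
  videoQuality: 'high',
  durationLimit: 60,    // seconds
};

ImagePicker.launchCamera(options, response => {
  if (response.didCancel || response.error) {
    return;
  }
  const body = new FormData();
  body.append('video', {
    uri: response.uri,   // file:// or content:// path to the recording
    type: 'video/mp4',   // assumed container; check the response on your device
    name: 'upload.mp4',
  });
  fetch('https://example.com/upload', { method: 'POST', body })
    .then(() => console.log('uploaded'))
    .catch(err => console.warn(err));
});

launchCamera opens the camera app directly; in the version I used there is also showImagePicker, which first shows a dialog letting the user choose between camera and library.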
answered 2018-02-14 at 06:32

Here's a demo I just made yesterday... in case it helps:

import React from 'react';
import { View, Text, Alert } from 'react-native';
import { BarCodeScanner, Permissions } from 'expo';

class CameraForm extends React.Component {
  state = {
    hasCameraPermission: null
  };

  componentDidMount() {
    this.permissionCheck();
  }

  permissionCheck = async () => {
    const { status } = await Permissions.askAsync(Permissions.CAMERA);
    this.setState({
      hasCameraPermission: status === 'granted'
    });
  };

  handleBarCodeScanRead = data => {
    Alert.alert(
      'Scan successful!',
      JSON.stringify(data)
    );
  };

  render() {
    return (
      <View style={styles.container}>
        <Text>Scan your wallet code</Text>
        {this.state.hasCameraPermission === null ?
          <Text>Requesting for camera permission</Text> :
          this.state.hasCameraPermission === false ?
            <Text>Camera permission is not granted</Text> :
            <BarCodeScanner
              onBarCodeRead={this.handleBarCodeScanRead}
              style={{ height: 400, width: 400, marginTop: 20 }}
            />
        }
      </View>
    );
  }
}

const styles = {
  container: {
    flex: 1,
    alignSelf: 'stretch',
    alignItems: 'center',
    justifyContent: 'center',
    backgroundColor: 'white'
  }
};

export default CameraForm;

answered 2018-02-14 at 05:49

Regarding the app crashing every time: have you double-checked that all the permissions your app needs are listed in AndroidManifest.xml / Info.plist?

Also, what is the specific reason you want to use an external app for the camera view? I have used react-native-camera and it works seamlessly.

answered 2018-02-14 at 04:58