
Capturing Raw YUV420p Camera Data with the Android NDK

First, note that frameworks/av/camera/Camera.cpp (the legacy android.hardware.Camera path) is deprecated and should not be used for new code, although migrating off the old Camera API is costly enough that most companies never bother.
This article first introduces the common raw data formats, then walks through how to use the NDK camera API; a later article will dig into the source code.
The overall pipeline looks like this:
CameraManager → CameraService → Camera HAL v3 → Sensor/Driver.

Common raw video data formats

Video is fundamentally a sequence of still images; thanks to persistence of vision, at around 24 frames per second the eye can no longer pick out individual frames.
Encoding uses algorithms to exploit the redundancy between frames and compress them.
Decoding is the reverse process: the compressed data is reconstructed into individual frames, which are then displayed.

yuv420p

This is the most common layout. As an example, a 4x2-pixel image is stored as follows.
First the Y plane, one sample per pixel:
YYYY
YYYY
Then the U plane; every 2x2 block of Y samples shares one U sample:
UU
Then the V plane, likewise:
VV
The final layout in memory:

YYYY
YYYY
UU
VV

A 5x3-pixel image is stored as follows. The Y plane again matches the pixels; for odd dimensions, the chroma planes round up (ceil(5/2) = 3 columns, ceil(3/2) = 2 rows):

YYYYY
YYYYY
YYYYY
UUU
UUU
VVV
VVV

In total this occupies 15 + 6 + 6 = 27 bytes in memory.
YU12

YU12 (also called I420) is the canonical yuv420p layout: for a 4x8 image, the full Y plane is followed by the full U plane and then the full V plane.

YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
UU
UU
UU
UU
VV
VV
VV
VV

YV12

YV12 is the same layout with the plane order swapped: the V plane comes before the U plane.

YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
VV
VV
VV
VV
UU
UU
UU
UU

yuv420sp

The difference from yuv420p: in the planar (p) layouts the U and V samples live in separate, sequential planes, while in the semi-planar (sp) layouts they are interleaved in a single plane.
yuv420sp has two variants: NV12 and NV21.
NV12

For a 4x8 image, the Y plane is followed by a single interleaved plane of U-then-V pairs:

YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
UVUV
UVUV
UVUV
UVUV

NV21

NV21 is the same, but each pair is V-then-U:

YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
YYYY
VUVU
VUVU
VUVU
VUVU

Wrapping it up in code

CMake


# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html.
# For more examples on how to use CMake, see https://github.com/android/ndk-samples.

# Sets the minimum CMake version required for this project.
cmake_minimum_required(VERSION 3.22.1)

# Declares the project name. The project name can be accessed via ${ PROJECT_NAME},
# Since this is the top level CMakeLists.txt, the project name is also accessible
# with ${CMAKE_PROJECT_NAME} (both CMake variables are in-sync within the top level
# build script scope).
project(openslLearn VERSION 0.1.0 LANGUAGES C CXX)

# Set the C++ standard
set(CMAKE_CXX_STANDARD 23)  # use the C++23 standard
set(CMAKE_CXX_STANDARD_REQUIRED ON)  # make the chosen standard mandatory
set(CMAKE_CXX_EXTENSIONS OFF)        # disable compiler extensions (pure standard C++)

# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.
#
# In this top level CMakeLists.txt, ${CMAKE_PROJECT_NAME} is used to define
# the target library name; in the sub-module's CMakeLists.txt, ${PROJECT_NAME}
# is preferred for the same purpose.
#
# In order to load a library into your app from Java/Kotlin, you must call
# System.loadLibrary() and pass the name of the library defined here;
# for GameActivity/NativeActivity derived applications, the same library name must be
# used in the AndroidManifest.xml file.

# First library
# Collect its source files
file(GLOB_RECURSE LEARN01_SOURCES CONFIGURE_DEPENDS
        "src/learn01/*.cpp"
        "src/learn01/*.c"
)
add_library(${CMAKE_PROJECT_NAME} SHARED ${LEARN01_SOURCES})

# Include paths for the headers
target_include_directories(${CMAKE_PROJECT_NAME}
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/learn01
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/logging
)

# Specifies libraries CMake should link to your target library. You
# can link libraries from various origins, such as libraries defined in this
# build script, prebuilt third-party libraries, or Android system libraries.
target_link_libraries(${CMAKE_PROJECT_NAME}
    # List libraries link to the target library
    android
    log
    OpenSLES
)

# Second library (openslLearn2)
file(GLOB_RECURSE LEARN02_SOURCES CONFIGURE_DEPENDS
        "src/learn02/*.cpp"
        "src/learn02/*.c"
        "src/sqlite/*.cpp"
        "src/sqlite/*.c"
)
set(LIBRARY_NAME2 ${CMAKE_PROJECT_NAME}2)
message("LIBRARY_NAME2: ${LIBRARY_NAME2}")
add_library(${LIBRARY_NAME2} SHARED ${LEARN02_SOURCES})  # built from a different source set
target_include_directories(${LIBRARY_NAME2}
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/sqlite
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/learn02
        PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/logging
)
find_package (oboe REQUIRED CONFIG)
target_link_libraries(${LIBRARY_NAME2}
        android
        log
        aaudio
        oboe::oboe
        camera2ndk
        mediandk
)

Header

//
// Created by 29051 on 2025/10/25.
//

#ifndef OPENSL_LEARN_CAMERA_HPP
#define OPENSL_LEARN_CAMERA_HPP

extern "C" {
#include <camera/NdkCameraManager.h>
#include <media/NdkImageReader.h>
}

#include <string>
#include <fstream>

#include "logging.hpp"

class NDKCamera {
private:
    int mWidth;
    int wHeight;
    ACameraManager *aCameraManager = nullptr;
    ACameraDevice *device = nullptr;
    ACameraCaptureSession *session = nullptr;
    AImageReader *aImageReader = nullptr;
    ACaptureSessionOutputContainer *aCaptureSessionOutputContainer = nullptr;
    ACaptureSessionOutput *sessionOutput = nullptr;
    std::string yuvPath;
    std::ofstream *yuvStream = nullptr;
public:
    NDKCamera(int width, int height, std::string yuvPath);
    ~NDKCamera();
    /**
     * Print the capabilities of the given camera.
     */
    void printCameraCapabilities(const char * cameraId);
};

#endif //OPENSL_LEARN_CAMERA_HPP

Source file

//
// Created by 29051 on 2025/10/25.
//
#include "NDKCamera.hpp"

#include <utility>

const char * const TAG = "NDKCamera";

/**
 * CameraManager → CameraService → Camera HAL v3 → Sensor/Driver
 * @param width
 * @param height
 */
NDKCamera::NDKCamera(int width, int height, std::string yuvPath) : mWidth(width), wHeight(height), yuvPath(std::move(yuvPath)) {
    logger::info(TAG, "width: %d, height: %d, yuvPath: %s", this->mWidth, this->wHeight, this->yuvPath.c_str());
    this->yuvStream = new std::ofstream(this->yuvPath, std::ios::binary);
    if (!this->yuvStream->is_open()){
        logger::error(TAG, "failed to open the output file");
        return;
    }
    aCameraManager = ACameraManager_create();
    if (aCameraManager == nullptr){
        logger::error(TAG, "aCameraManager is null");
        return;
    }
    ACameraIdList *cameraIdList = nullptr;
    camera_status_t status = ACameraManager_getCameraIdList(aCameraManager, &cameraIdList);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraManager_getCameraIdList failed");
        return;
    }
    if (cameraIdList->numCameras <= 0){
        logger::error(TAG, "this device has no cameras");
        return;
    }
    for(int i = 0; i < cameraIdList->numCameras; i ++ ){
        logger::info(TAG, "index: %d, cameraId: %s", i, cameraIdList->cameraIds[i]);
    }
    // NOTE: index 1 (often the front camera) is assumed here; fall back to 0 on single-camera devices.
    const char* cameraId = cameraIdList->cameraIds[cameraIdList->numCameras > 1 ? 1 : 0];
    this->printCameraCapabilities(cameraId);
    ACameraDevice_StateCallbacks deviceStateCallbacks = {
            .context = nullptr,
            .onDisconnected = [](void*, ACameraDevice* aCameraDevice) -> void {},
            .onError = [](void*, ACameraDevice* aCameraDevice, int errorCode) -> void {},
    };
    status = ACameraManager_openCamera(aCameraManager, cameraId, &deviceStateCallbacks, &device);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraManager_openCamera failed");
        return;
    }
    media_status_t mediaStatus = AImageReader_new(width, height, AIMAGE_FORMAT_YUV_420_888, 4, &aImageReader);
    if (mediaStatus != AMEDIA_OK){
        logger::error(TAG, "AImageReader_new failed");
        return;
    }
    AImageReader_ImageListener imageListener = {
            .context = this,
            .onImageAvailable = [](void* context, AImageReader* reader) -> void {
                AImage *image = nullptr;
                media_status_t mediaStatus = AImageReader_acquireNextImage(reader, &image);
                if (mediaStatus != AMEDIA_OK || image == nullptr){
                    logger::error(TAG, "failed to acquire the current YUV frame");
                    AImage_delete(image);
                    return;
                }
                int32_t width = 0, height = 0;
                mediaStatus = AImage_getWidth(image, &width);
                if (mediaStatus != AMEDIA_OK || image == nullptr){
                    logger::error(TAG, "failed to get the frame width");
                    AImage_delete(image);
                    return;
                }
                mediaStatus = AImage_getHeight(image, &height);
                if (mediaStatus != AMEDIA_OK || image == nullptr){
                    logger::error(TAG, "failed to get the frame height");
                    AImage_delete(image);
                    return;
                }
                // ==========
                const auto *ndkCamera = reinterpret_cast<NDKCamera*>(context);
                for (int plane = 0; plane < 3; ++plane) {
                    uint8_t* planeData = nullptr;
                    int planeDataLen = 0;
                    if (AImage_getPlaneData(image, plane, &planeData, &planeDataLen) != AMEDIA_OK) {
                        logger::error(TAG, "AImage_getPlaneData failed plane=%d", plane);
                        AImage_delete(image);
                        return;
                    }
                    int rowStride = 0, pixelStride = 0;
                    AImage_getPlaneRowStride(image, plane, &rowStride);
                    AImage_getPlanePixelStride(image, plane, &pixelStride);

                    int planeWidth = (plane == 0) ? width : (width + 1) / 2;
                    int planeHeight = (plane == 0) ? height : (height + 1) / 2;

                    // Write row by row honoring pixelStride, so the output is contiguous Y, then U, then V
                    for (int y = 0; y < planeHeight; ++y) {
                        const uint8_t* rowPtr = planeData + y * rowStride;
                        if (pixelStride == 1) {
                            // write planeWidth bytes directly
                            ndkCamera->yuvStream->write(reinterpret_cast<const char*>(rowPtr), planeWidth);
                        } else {
                            // extract sample by sample using pixelStride
                            for (int x = 0; x < planeWidth; ++x) {
                                ndkCamera->yuvStream->put(rowPtr[x * pixelStride]);
                            }
                        }
                    }
                }
                AImage_delete(image);
                logger::info(TAG, "yuv width: %d, height: %d", width, height);
            },
    };
    AImageReader_setImageListener(aImageReader, &imageListener);
    ANativeWindow* window = nullptr;
    mediaStatus = AImageReader_getWindow(aImageReader, &window);
    if (mediaStatus != AMEDIA_OK){
        logger::error(TAG, "AImageReader_getWindow failed");
        return;
    }
    ACaptureRequest *request = nullptr;
    status = ACameraDevice_createCaptureRequest(device, TEMPLATE_PREVIEW, &request);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraDevice_createCaptureRequest failed");
        return;
    }
    // Set the target frame-rate range
    int32_t range[2] = {30, 30}; // lock to 30 fps
    ACaptureRequest_setEntry_i32(request,
                                 ACAMERA_CONTROL_AE_TARGET_FPS_RANGE,
                                 2, range);
    ACameraOutputTarget *aCameraOutputTarget = nullptr;
    status = ACameraOutputTarget_create(window, &aCameraOutputTarget);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraOutputTarget_create failed");
        return;
    }
    status = ACaptureRequest_addTarget(request, aCameraOutputTarget);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACaptureRequest_addTarget failed");
        return;
    }
    ACameraCaptureSession_stateCallbacks sessionStateCallbacks = {
            .context = nullptr,
            .onClosed = [](void* context, ACameraCaptureSession *session) -> void {
                logger::info(TAG, "onClosed...");
            },
            .onReady = [](void* context, ACameraCaptureSession *session) -> void {
                logger::info(TAG, "onReady...");
            },
            .onActive = [](void* context, ACameraCaptureSession *session) -> void {
                logger::info(TAG, "onActive...");
            },
    };
    ACameraCaptureSession_captureCallbacks captureCallbacks = {
            .context = nullptr,
            .onCaptureStarted = [](void* context, ACameraCaptureSession* session,
                                   const ACaptureRequest* request, int64_t timestamp) -> void {
                logger::info(TAG, "onCaptureStarted timestamp: %lld", static_cast<long long>(timestamp));
            },
            .onCaptureProgressed = [](void* context, ACameraCaptureSession* session,
                                      ACaptureRequest* request, const ACameraMetadata* result) -> void {
                logger::info(TAG, "onCaptureProgressed...");
            },
            .onCaptureCompleted = [](void* context, ACameraCaptureSession* session,
                                     ACaptureRequest* request, const ACameraMetadata* result) -> void {
                ACameraMetadata_const_entry fpsEntry = {};
                if (ACameraMetadata_getConstEntry(result,
                                                  ACAMERA_CONTROL_AE_TARGET_FPS_RANGE, &fpsEntry) == ACAMERA_OK) {
                    if (fpsEntry.count >= 2) {
                        int32_t minFps = fpsEntry.data.i32[0];
                        int32_t maxFps = fpsEntry.data.i32[1];
                        logger::info(TAG, "onCaptureCompleted current fps range: [%d, %d]", minFps, maxFps);
                    }
                }
            },
            .onCaptureFailed = [](void* context, ACameraCaptureSession* session,
                                  ACaptureRequest* request, ACameraCaptureFailure* failure) -> void {
                logger::info(TAG, "onCaptureFailed frameNumber: %lld, reason: %d, sequenceId: %d, wasImageCaptured: %d", static_cast<long long>(failure->frameNumber), failure->reason, failure->sequenceId, failure->wasImageCaptured);
            },
            .onCaptureSequenceCompleted = [](void* context, ACameraCaptureSession* session,
                                             int sequenceId, int64_t frameNumber) -> void {
                logger::info(TAG, "onCaptureSequenceCompleted sequenceId: %d, frameNumber: %lld", sequenceId, static_cast<long long>(frameNumber));
            },
            .onCaptureSequenceAborted = [](void* context, ACameraCaptureSession* session,
                                           int sequenceId) -> void {
                logger::info(TAG, "onCaptureSequenceAborted sequenceId: %d", sequenceId);
            },
            .onCaptureBufferLost = [](void* context, ACameraCaptureSession* session,
                                      ACaptureRequest* request, ACameraWindowType* window, int64_t frameNumber) -> void {
                logger::info(TAG, "onCaptureBufferLost frameNumber: %lld", static_cast<long long>(frameNumber));
            },
    };

    status = ACaptureSessionOutputContainer_create(&aCaptureSessionOutputContainer);

    if (status != ACAMERA_OK){
        logger::error(TAG, "ACaptureSessionOutputContainer_create failed");
        return;
    }
    status = ACaptureSessionOutput_create(window, &sessionOutput);

    if (status != ACAMERA_OK){
        logger::error(TAG, "ACaptureSessionOutput_create failed");
        return;
    }
    status = ACaptureSessionOutputContainer_add(aCaptureSessionOutputContainer, sessionOutput);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACaptureSessionOutputContainer_add failed");
        return;
    }
    status = ACameraDevice_createCaptureSession(device, aCaptureSessionOutputContainer, &sessionStateCallbacks, &session);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraDevice_createCaptureSession failed");
        return;
    }
#if __ANDROID_API__ >= 33
    ACameraCaptureSession_captureCallbacksV2 captureCallbacksV2 = {
            .context = nullptr,
            .onCaptureStarted = [](void* context, ACameraCaptureSession* session,
                                   const ACaptureRequest* request, int64_t timestamp, int64_t frameNumber) -> void {

            },
            .onCaptureProgressed = [](void* context, ACameraCaptureSession* session,
                                      ACaptureRequest* request, const ACameraMetadata* result) -> void {

            },
            .onCaptureCompleted = [](void* context, ACameraCaptureSession* session,
                                     ACaptureRequest* request, const ACameraMetadata* result) -> void {

            },
            .onCaptureFailed = [](void* context, ACameraCaptureSession* session,
                                  ACaptureRequest* request, ACameraCaptureFailure* failure) -> void {

            },
            .onCaptureSequenceCompleted = [](void* context, ACameraCaptureSession* session,
                                             int sequenceId, int64_t frameNumber) -> void {

            },
            .onCaptureSequenceAborted = [](void* context, ACameraCaptureSession* session,
                                           int sequenceId) -> void {

            },
            .onCaptureBufferLost = [](void* context, ACameraCaptureSession* session,
                                      ACaptureRequest* request, ACameraWindowType* window, int64_t frameNumber) -> void {

            },
    };
    status = ACameraCaptureSession_setRepeatingRequestV2(session, &captureCallbacksV2, 1, &request, nullptr);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraCaptureSession_setRepeatingRequestV2 failed");
        return;
    }
#else
    status = ACameraCaptureSession_setRepeatingRequest(session, &captureCallbacks, 1, &request, nullptr);
    if (status != ACAMERA_OK){
        logger::error(TAG, "ACameraCaptureSession_setRepeatingRequest failed");
        return;
    }
#endif
}
NDKCamera::~NDKCamera() {
    logger::info(TAG, "~NDKCamera...");
    // Tear down in pipeline order: stop the session before releasing its consumers.
    if (session != nullptr){
        ACameraCaptureSession_close(session);
    }
    if (device != nullptr){
        ACameraDevice_close(device);
    }
    if (this->aImageReader != nullptr){
        AImageReader_delete(this->aImageReader);
    }
    if (aCameraManager != nullptr) {
        ACameraManager_delete(aCameraManager);
    }
    if (this->yuvStream != nullptr){
        this->yuvStream->close();
        delete this->yuvStream;  // the stream was heap-allocated in the constructor
        this->yuvStream = nullptr;
    }
    if (this->aCaptureSessionOutputContainer != nullptr){
        ACaptureSessionOutputContainer_free(this->aCaptureSessionOutputContainer);
    }
    if (this->sessionOutput != nullptr){
        ACaptureSessionOutput_free(this->sessionOutput);
    }
}

void NDKCamera::printCameraCapabilities(const char * const cameraId){
    ACameraMetadata *metadata = nullptr;
    camera_status_t status = ACameraManager_getCameraCharacteristics(this->aCameraManager, cameraId, &metadata);
    if(status != ACAMERA_OK){
        logger::error(TAG, "failed to get camera characteristics");
        return;
    }
    ACameraMetadata_const_entry entry = {};
    if (ACameraMetadata_getConstEntry(metadata, ACAMERA_SCALER_AVAILABLE_STREAM_CONFIGURATIONS, &entry) == ACAMERA_OK){
        logger::info(TAG, "supported resolutions:");
        for(uint32_t i = 0; i + 3 < entry.count; i += 4){
            int32_t format = entry.data.i32[i + 0];
            int32_t width = entry.data.i32[i + 1];
            int32_t height = entry.data.i32[i + 2];
            int32_t isInput = entry.data.i32[i + 3];
            if (isInput == 0 && format == AIMAGE_FORMAT_YUV_420_888){
                logger::info(TAG, "format: %d, width: %d, height: %d, isInput: %d", format, width, height, isInput);
            }
        }
    }
    if (ACameraMetadata_getConstEntry(metadata, ACAMERA_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES, &entry) == ACAMERA_OK){
        logger::info(TAG, "supported fps ranges:");
        for (uint32_t i = 0; i + 1 < entry.count; i += 2) {
            logger::info(TAG, "fps range: [%d, %d]", entry.data.i32[i], entry.data.i32[i + 1]);
        }
    }
    ACameraMetadata_free(metadata);
}

Exposing it to Kotlin

extern "C"
JNIEXPORT jlong JNICALL
Java_io_github_opensllearn_utils_Utils_initCamera(JNIEnv *env, jobject, jint width, jint height, jstring pcmPath) {
    NDKCamera *ndkCamera = nullptr;
    try {
        const char * const pcmPathStr = env->GetStringUTFChars(pcmPath, nullptr);
        ndkCamera = new NDKCamera(width, height, pcmPathStr);
        // ReleaseStringUTFChars must always be called, whether or not the VM returned a copy.
        env->ReleaseStringUTFChars(pcmPath, pcmPathStr);
    } catch (const std::exception &e) {
        delete ndkCamera;
        ndkCamera = nullptr;
        env->ThrowNew(env->FindClass("java/lang/RuntimeException"), e.what());
    }
    return reinterpret_cast<jlong>(ndkCamera);
}
extern "C"
JNIEXPORT void JNICALL
Java_io_github_opensllearn_utils_Utils_releaseCamera(JNIEnv*, jobject, jlong ptr) {
    const auto* const ndkCamera = reinterpret_cast<NDKCamera*>(ptr);
    delete ndkCamera;
}

That's it. If you later want to render the frames, pass a Surface into native code and use OpenGL: first convert the yuv420p data to RGB, then hand it to OpenGL. It is not particularly complicated.

Core logic

for (int plane = 0; plane < 3; ++plane) {
	uint8_t* planeData = nullptr;
	int planeDataLen = 0;
	if (AImage_getPlaneData(image, plane, &planeData, &planeDataLen) != AMEDIA_OK) {
		logger::error(TAG, "AImage_getPlaneData failed plane=%d", plane);
		AImage_delete(image);
		return;
	}
	int rowStride = 0, pixelStride = 0;
	AImage_getPlaneRowStride(image, plane, &rowStride);
	AImage_getPlanePixelStride(image, plane, &pixelStride);

	int planeWidth = (plane == 0) ? width : (width + 1) / 2;
	int planeHeight = (plane == 0) ? height : (height + 1) / 2;

	// Write row by row honoring pixelStride, so the output is contiguous Y, then U, then V
	for (int y = 0; y < planeHeight; ++y) {
		const uint8_t* rowPtr = planeData + y * rowStride;
		if (pixelStride == 1) {
			// write planeWidth bytes directly
			ndkCamera->yuvStream->write(reinterpret_cast<const char*>(rowPtr), planeWidth);
		} else {
			// extract sample by sample using pixelStride
			for (int x = 0; x < planeWidth; ++x) {
				ndkCamera->yuvStream->put(rowPtr[x * pixelStride]);
			}
		}
	}
}

AIMAGE_FORMAT_YUV_420_888: the trailing 888 means the Y, U, and V samples each occupy 8 bits (one byte).
This flexible format covers both the yuv420p and the yuv420sp layouts described above.

int32_t planes = 0;
AImage_getNumberOfPlanes(image, &planes);

AImage_getNumberOfPlanes returns the number of planes; for AIMAGE_FORMAT_YUV_420_888 it is 3 (one plane each for Y, U, and V).
AImage_getPlaneData(image, plane, &planeData, &planeDataLen) returns the buffer of the requested plane.
planeData points at the plane's bytes, conceptually a 2D array of rows; planeDataLen is the total length when that array is flattened to 1D.
For example:


planeData
|
YYYY
YYYY

And another:

planeData
|
UPUP

Now the climax arrives, stay sharp! First, an AI-generated joke.

"Hacking on this code until the small hours, the existential question hits: what is learning YUV formats and fiddling with AImage even for?" "If a rich patron burst in right now, slapped me, said 'stop grinding on this junk', then tossed me a black card with 'I'm taking you around the world', I'd delete the compiler on the spot!"
"Stuck another two hours debugging the YUV420P conversion, staring blankly at the screen: how much money does any of this actually earn?" "I picture it: she kicks the door open, slaps me, and declares 'stop fighting with pixels, we're off to the Maldives to sunbathe'. Sigh. The dream's over; back to fixing bugs."
"While writing the AImage extraction code, the urge to quit: besides hair loss, what does this niche skill buy me?" "If someone would just tap my cheek, say 'stop studying, it's useless, let's travel the world', I'd drag the project folder into the recycle bin without a second's hesitation!"

The dream is over!
AImage_getPlaneRowStride returns the number of bytes per row, and that figure can include padding (invalid bytes).
For example:

planeData
|
UPUP

The P bytes are padding, which is why the next function has to step in.
AImage_getPlanePixelStride returns the distance in bytes between adjacent valid samples within a row.
When the pixel stride is greater than 1, you have to copy the plane one byte at a time.
Done.

android.hardware.Camera source walkthrough

This API is deprecated, but that is no obstacle to reading its source.

open->native_setup

frameworks/base/core/jni/android_hardware_Camera.cpp

// connect to camera service
static jint android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz, jobject weak_this,
                                                 jint cameraId, jint rotationOverride,
                                                 jboolean forceSlowJpegMode,
                                                 jobject jClientAttributionParcel,
                                                 jint devicePolicy) {
    AttributionSourceState clientAttribution;
    if (!attributionSourceStateForJavaParcel(env, jClientAttributionParcel,
                                             /* useContextAttributionSource= */ true,
                                             clientAttribution)) {
        return -EACCES;
    }

    int targetSdkVersion = android_get_application_target_sdk_version();
    sp<Camera> camera = Camera::connect(cameraId, targetSdkVersion, rotationOverride,
                                        forceSlowJpegMode, clientAttribution, devicePolicy); // 1
    if (camera == NULL) {
        return -EACCES;
    }
    //...
}

The call at marker 1 connects to the CameraService.
frameworks/av/camera/Camera.cpp

sp<Camera> Camera::connect(int cameraId, int targetSdkVersion, int rotationOverride,
        bool forceSlowJpegMode, const AttributionSourceState& clientAttribution,
        int32_t devicePolicy)
{
    return CameraBaseT::connect(cameraId, targetSdkVersion, rotationOverride,
            forceSlowJpegMode, clientAttribution, devicePolicy);
}

frameworks/av/camera/CameraBase.cpp

template <typename TCam, typename TCamTraits>
sp<TCam> CameraBase<TCam, TCamTraits>::connect(int cameraId,
                                               int targetSdkVersion, int rotationOverride,
                                               bool forceSlowJpegMode,
                                               const AttributionSourceState& clientAttribution,
                                               int32_t devicePolicy)
{
    ALOGV("%s: connect", __FUNCTION__);
    sp<TCam> c = new TCam(cameraId);
    sp<TCamCallbacks> cl = c;
    const sp<::android::hardware::ICameraService> cs = getCameraService();

    binder::Status ret;
    if (cs != nullptr) {
        TCamConnectService fnConnectService = TCamTraits::fnConnectService; // 1
        ALOGI("Connect camera (legacy API) - rotationOverride %d, forceSlowJpegMode %d",
                rotationOverride, forceSlowJpegMode);
        ret = (cs.get()->*fnConnectService)(cl, cameraId, targetSdkVersion,
                rotationOverride, forceSlowJpegMode, clientAttribution, devicePolicy,
                /*out*/ &c->mCamera);
    }
    if (ret.isOk() && c->mCamera != nullptr) {
        IInterface::asBinder(c->mCamera)->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        ALOGW("An error occurred while connecting to camera %d: %s", cameraId,
                (cs == nullptr) ? "Service not available" : ret.toString8().c_str());
        c.clear();
    }
    return c;
}
// establish binder interface to camera service
namespace {
    sp<::android::hardware::ICameraService> gCameraService;
    const char*               kCameraServiceName      = "media.camera";
    // ...
}
template <typename TCam, typename TCamTraits>
const sp<::android::hardware::ICameraService> CameraBase<TCam, TCamTraits>::getCameraService()
{
    Mutex::Autolock _l(gLock);
    if (gCameraService.get() == 0) {
        if (CameraUtils::isCameraServiceDisabled()) {
            return gCameraService;
        }

        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        binder = sm->waitForService(toString16(kCameraServiceName));
        if (binder == nullptr) {
            return nullptr;
        }
        if (gDeathNotifier == NULL) {
            gDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(gDeathNotifier);
        gCameraService = interface_cast<::android::hardware::ICameraService>(binder);
    }
    ALOGE_IF(gCameraService == 0, "no CameraService!?");
    return gCameraService;
}

At marker 1:
frameworks/av/camera/Camera.cpp

CameraTraits<Camera>::TCamConnectService CameraTraits<Camera>::fnConnectService =
        &::android::hardware::ICameraService::connect;

This is where the IPC call goes from BpCameraService (the client-side proxy) to BnCameraService (the server-side stub); CameraService implements BnCameraService.
frameworks/av/services/camera/libcameraservice/CameraService.h

class CameraService :
    public BinderService<CameraService>,
    public virtual ::android::hardware::BnCameraService,
    public virtual IBinder::DeathRecipient,
    public virtual CameraProviderManager::StatusListener,
    public virtual IServiceManager::LocalRegistrationCallback,
    public AttributionAndPermissionUtilsEncapsulator
{
    friend class BinderService<CameraService>;
    friend class CameraOfflineSessionClient;
    // ...
}

frameworks/av/services/camera/libcameraservice/CameraService.cpp

Status CameraService::connect(
        const sp<ICameraClient>& cameraClient,
        int api1CameraId,
        int targetSdkVersion,
        int rotationOverride,
        bool forceSlowJpegMode,
        const AttributionSourceState& clientAttribution,
        int32_t devicePolicy,
        /*out*/
        sp<ICamera>* device) {
    ATRACE_CALL();
    Status ret = Status::ok();

    std::string cameraIdStr =
            cameraIdIntToStr(api1CameraId, clientAttribution.deviceId, devicePolicy);
    if (cameraIdStr.empty()) {
        std::string msg = fmt::sprintf("Camera %d: Invalid camera id for device id %d",
                api1CameraId, clientAttribution.deviceId);
        ALOGE("%s: %s", __FUNCTION__, msg.c_str());
        return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.c_str());
    }

    std::string clientPackageNameMaybe = clientAttribution.packageName.value_or("");
    bool isNonSystemNdk = clientPackageNameMaybe.size() == 0;

    AttributionSourceState resolvedClientAttribution(clientAttribution);
    ret = resolveAttributionSource(resolvedClientAttribution, __FUNCTION__, cameraIdStr);
    if (!ret.isOk()) {
        logRejected(cameraIdStr, getCallingPid(),
                    clientAttribution.packageName.value_or(kUnknownPackageName),
                    toStdString(ret.toString8()));
        return ret;
    }

    const int clientPid = resolvedClientAttribution.pid;
    const int clientUid = resolvedClientAttribution.uid;
    const std::string& clientPackageName = *resolvedClientAttribution.packageName;

    logConnectionAttempt(clientPid, clientPackageName, cameraIdStr, API_1);

    sp<Client> client = nullptr;
    ret = connectHelper<ICameraClient, Client>(
            cameraClient, cameraIdStr, api1CameraId, resolvedClientAttribution,
            /*systemNativeClient*/ false, API_1,
            /*shimUpdateOnly*/ false, /*oomScoreOffset*/ 0, targetSdkVersion, rotationOverride,
            forceSlowJpegMode, cameraIdStr, isNonSystemNdk, /*sharedMode*/false,
            /*isVendorClient*/ false, /*out*/ client); // 1

    if (!ret.isOk()) {
        logRejected(cameraIdStr, getCallingPid(),
                    clientAttribution.packageName.value_or(kUnknownPackageName),
                    toStdString(ret.toString8()));
        return ret;
    }

    *device = client;

    const sp<IServiceManager> sm(defaultServiceManager());
    const auto& mActivityManager = getActivityManager();
    if (mActivityManager) {
        mActivityManager->logFgsApiBegin(LOG_FGS_CAMERA_API,
            getCallingUid(),
            getCallingPid());
    }

    return ret;
}
template <class CALLBACK, class CLIENT>
Status CameraService::connectHelper(const sp<CALLBACK>& cameraCb, const std::string& cameraId,
                                    int api1CameraId,
                                    const AttributionSourceState& clientAttribution,
                                    bool systemNativeClient, apiLevel effectiveApiLevel,
                                    bool shimUpdateOnly, int oomScoreOffset, int targetSdkVersion,
                                    int rotationOverride, bool forceSlowJpegMode,
                                    const std::string& originalCameraId, bool isNonSystemNdk,
                                    bool sharedMode, bool isVendorClient,
                                    /*out*/ sp<CLIENT>& device) {
    binder::Status ret = binder::Status::ok();

    nsecs_t openTimeNs = systemTime();

    sp<CLIENT> client = nullptr;
    int facing = -1;
    int orientation = 0;

    const std::string clientPackageName =
            clientAttribution.packageName.value_or(kUnknownPackageName);

    {
        // Acquire mServiceLock and prevent other clients from connecting
        std::unique_ptr<AutoConditionLock> lock =
                AutoConditionLock::waitAndAcquire(mServiceLockWrapper, DEFAULT_CONNECT_TIMEOUT_NS);

        if (lock == nullptr) {
            ALOGE("CameraService::connect (PID %d) rejected (too many other clients connecting).",
                  clientAttribution.pid);
            return STATUS_ERROR_FMT(
                    ERROR_MAX_CAMERAS_IN_USE,
                    "Cannot open camera %s for \"%s\" (PID %d): Too many other clients connecting",
                    cameraId.c_str(), clientPackageName.c_str(), clientAttribution.pid);
        }

        // Enforce client permissions and do basic validity checks
        if (!(ret = validateConnectLocked(cameraId, clientAttribution, sharedMode)).isOk()) {
            return ret;
        }

        // Check the shim parameters after acquiring lock, if they have already been updated and
        // we were doing a shim update, return immediately
        if (shimUpdateOnly) {
            auto cameraState = getCameraState(cameraId);
            if (cameraState != nullptr) {
                if (!cameraState->getShimParams().isEmpty()) return ret;
            }
        }

        status_t err;

        sp<BasicClient> clientTmp = nullptr;
        std::shared_ptr<resource_policy::ClientDescriptor<std::string, sp<BasicClient>>> partial;
        if ((err = handleEvictionsLocked(
                     cameraId, clientAttribution.pid, effectiveApiLevel,
                     IInterface::asBinder(cameraCb),
                     clientAttribution.packageName.value_or(kUnknownPackageName), oomScoreOffset,
                     systemNativeClient, sharedMode, /*out*/ &clientTmp,
                     /*out*/ &partial)) != NO_ERROR) {
            switch (err) {
                case -ENODEV:
                    return STATUS_ERROR_FMT(ERROR_DISCONNECTED,
                            "No camera device with ID \"%s\" currently available",
                            cameraId.c_str());
                case -EBUSY:
                    return STATUS_ERROR_FMT(ERROR_CAMERA_IN_USE,
                            "Higher-priority client using camera, ID \"%s\" currently unavailable",
                            cameraId.c_str());
                case -EUSERS:
                    return STATUS_ERROR_FMT(ERROR_MAX_CAMERAS_IN_USE,
                            "Too many cameras already open, cannot open camera \"%s\"",
                            cameraId.c_str());
                default:
                    return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                            "Unexpected error %s (%d) opening camera \"%s\"",
                            strerror(-err), err, cameraId.c_str());
            }
        }

        if (clientTmp.get() != nullptr) {
            // Handle special case for API1 MediaRecorder where the existing client is returned
            device = static_cast<CLIENT*>(clientTmp.get());
            return ret;
        }

        // give flashlight a chance to close devices if necessary.
        mFlashlight->prepareDeviceOpen(cameraId);

        int portraitRotation;
        auto deviceVersionAndTransport =
                getDeviceVersion(cameraId, rotationOverride, /*out*/&portraitRotation,
                        /*out*/&facing, /*out*/&orientation);
        if (facing == -1) {
            ALOGE("%s: Unable to get camera device \"%s\"  facing", __FUNCTION__, cameraId.c_str());
            return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                    "Unable to get camera device \"%s\" facing", cameraId.c_str());
        }

        sp<BasicClient> tmp = nullptr;
        bool overrideForPerfClass = SessionConfigurationUtils::targetPerfClassPrimaryCamera(
                mPerfClassPrimaryCameraIds, cameraId, targetSdkVersion);

        // Only use passed in clientPid to check permission. Use calling PID as the client PID
        // that's connected to camera service directly.
        if (!(ret = makeClient(this, cameraCb, clientAttribution, getCallingPid(),
                               systemNativeClient, cameraId, api1CameraId, facing, orientation,
                               getpid(), deviceVersionAndTransport, effectiveApiLevel,
                               overrideForPerfClass, rotationOverride, forceSlowJpegMode,
                               originalCameraId, sharedMode, isVendorClient,
                               /*out*/ &tmp))
                     .isOk()) { // 2
            return ret;
        }
        client = static_cast<CLIENT*>(tmp.get());

        LOG_ALWAYS_FATAL_IF(client.get() == nullptr, "%s: CameraService in invalid state",
                __FUNCTION__);

        std::string monitorTags = isClientWatched(client.get()) ? mMonitorTags : std::string();
        err = client->initialize(mCameraProviderManager, monitorTags); // 3
        if (err != OK) {
            ALOGE("%s: Could not initialize client from HAL.", __FUNCTION__);
            // Errors could be from the HAL module open call or from AppOpsManager
            mServiceLock.unlock();
            client->disconnect();
            mServiceLock.lock();
            switch(err) {
                case BAD_VALUE:
                    return STATUS_ERROR_FMT(ERROR_ILLEGAL_ARGUMENT,
                            "Illegal argument to HAL module for camera \"%s\"", cameraId.c_str());
                case -EBUSY:
                    return STATUS_ERROR_FMT(ERROR_CAMERA_IN_USE,
                            "Camera \"%s\" is already open", cameraId.c_str());
                case -EUSERS:
                    return STATUS_ERROR_FMT(ERROR_MAX_CAMERAS_IN_USE,
                            "Too many cameras already open, cannot open camera \"%s\"",
                            cameraId.c_str());
                case PERMISSION_DENIED:
                    return STATUS_ERROR_FMT(ERROR_PERMISSION_DENIED,
                            "No permission to open camera \"%s\"", cameraId.c_str());
                case -EACCES:
                    return STATUS_ERROR_FMT(ERROR_DISABLED,
                            "Camera \"%s\" disabled by policy", cameraId.c_str());
                case -ENODEV:
                default:
                    return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                            "Failed to initialize camera \"%s\": %s (%d)", cameraId.c_str(),
                            strerror(-err), err);
            }
        }

        // Update shim paremeters for legacy clients
        if (effectiveApiLevel == API_1) {
            // Assume we have always received a Client subclass for API1
            sp<Client> shimClient = reinterpret_cast<Client*>(client.get());
            String8 rawParams = shimClient->getParameters();
            CameraParameters params(rawParams);

            auto cameraState = getCameraState(cameraId);
            if (cameraState != nullptr) {
                cameraState->setShimParams(params);
            } else {
                ALOGE("%s: Cannot update shim parameters for camera %s, no such device exists.",
                        __FUNCTION__, cameraId.c_str());
            }
        }

        // Enable/disable camera service watchdog
        client->setCameraServiceWatchdog(mCameraServiceWatchdogEnabled);

        CameraMetadata chars;
        bool rotateAndCropSupported = true;
        err = mCameraProviderManager->getCameraCharacteristics(cameraId, overrideForPerfClass,
                &chars, rotationOverride);
        if (err == OK) {
            auto availableRotateCropEntry = chars.find(
                    ANDROID_SCALER_AVAILABLE_ROTATE_AND_CROP_MODES);
            if (availableRotateCropEntry.count <= 1) {
                rotateAndCropSupported = false;
            }
        } else {
            ALOGE("%s: Unable to query static metadata for camera %s: %s (%d)", __FUNCTION__,
                    cameraId.c_str(), strerror(-err), err);
        }

        if (rotateAndCropSupported) {
            // Set rotate-and-crop override behavior
            if (mOverrideRotateAndCropMode != ANDROID_SCALER_ROTATE_AND_CROP_AUTO) {
                client->setRotateAndCropOverride(mOverrideRotateAndCropMode);
            } else if (rotationOverride != hardware::ICameraService::ROTATION_OVERRIDE_NONE &&
                    portraitRotation != 0) {
                uint8_t rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_AUTO;
                switch (portraitRotation) {
                    case 90:
                        rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_90;
                        break;
                    case 180:
                        rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_180;
                        break;
                    case 270:
                        rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_270;
                        break;
                    default:
                        ALOGE("Unexpected portrait rotation: %d", portraitRotation);
                        break;
                }
                // Here we're communicating to the client the chosen rotate
                // and crop mode to send to the HAL
                client->setRotateAndCropOverride(rotateAndCropMode);
            } else {
                client->setRotateAndCropOverride(
                        mCameraServiceProxyWrapper->getRotateAndCropOverride(
                                clientPackageName, facing,
                                multiuser_get_user_id(clientAttribution.uid)));
            }
        }

        bool autoframingSupported = true;
        auto availableAutoframingEntry = chars.find(ANDROID_CONTROL_AUTOFRAMING_AVAILABLE);
        if ((availableAutoframingEntry.count == 1) && (availableAutoframingEntry.data.u8[0] ==
                    ANDROID_CONTROL_AUTOFRAMING_AVAILABLE_FALSE)) {
            autoframingSupported = false;
        }

        if (autoframingSupported) {
            // Set autoframing override behaviour
            if (mOverrideAutoframingMode != ANDROID_CONTROL_AUTOFRAMING_AUTO) {
                client->setAutoframingOverride(mOverrideAutoframingMode);
            } else {
                client->setAutoframingOverride(
                    mCameraServiceProxyWrapper->getAutoframingOverride(
                        clientPackageName));
            }
        }

        bool isCameraPrivacyEnabled;
        if (flags::camera_privacy_allowlist()) {
            // Set camera muting behavior.
            isCameraPrivacyEnabled =
                    this->isCameraPrivacyEnabled(toString16(client->getPackageName()), cameraId,
                                                 clientAttribution.pid, clientAttribution.uid);
        } else {
            isCameraPrivacyEnabled =
                    mSensorPrivacyPolicy->isCameraPrivacyEnabled();
        }

        if (client->supportsCameraMute()) {
            client->setCameraMute(
                    mOverrideCameraMuteMode || isCameraPrivacyEnabled);
        } else if (isCameraPrivacyEnabled) {
            // no camera mute supported, but privacy is on! => disconnect
            ALOGI("Camera mute not supported for package: %s, camera id: %s",
                    client->getPackageName().c_str(), cameraId.c_str());
            // Do not hold mServiceLock while disconnecting clients, but
            // retain the condition blocking other clients from connecting
            // in mServiceLockWrapper if held.
            mServiceLock.unlock();
            // Clear caller identity temporarily so client disconnect PID
            // checks work correctly
            int64_t token = clearCallingIdentity();
            // Note AppOp to trigger the "Unblock" dialog
            client->noteAppOp();
            client->disconnect();
            restoreCallingIdentity(token);
            // Reacquire mServiceLock
            mServiceLock.lock();

            return STATUS_ERROR_FMT(ERROR_DISABLED,
                    "Camera \"%s\" disabled due to camera mute", cameraId.c_str());
        }

        if (shimUpdateOnly) {
            // If only updating legacy shim parameters, immediately disconnect client
            mServiceLock.unlock();
            client->disconnect();
            mServiceLock.lock();
        } else {
            // Otherwise, add client to active clients list
            finishConnectLocked(client, partial, oomScoreOffset, systemNativeClient);
        }

        client->setImageDumpMask(mImageDumpMask);
        client->setStreamUseCaseOverrides(mStreamUseCaseOverrides);
        client->setZoomOverride(mZoomOverrideValue);
    } // lock is destroyed, allow further connect calls

    // Important: release the mutex here so the client can call back into the service from its
    // destructor (can be at the end of the call)
    device = client;

    int32_t openLatencyMs = ns2ms(systemTime() - openTimeNs);
    mCameraServiceProxyWrapper->logOpen(cameraId, facing, clientPackageName,
            effectiveApiLevel, isNonSystemNdk, openLatencyMs);

    {
        Mutex::Autolock lock(mInjectionParametersLock);
        if (cameraId == mInjectionInternalCamId && mInjectionInitPending) {
            mInjectionInitPending = false;
            status_t res = NO_ERROR;
            auto clientDescriptor = mActiveClientManager.get(mInjectionInternalCamId);
            if (clientDescriptor != nullptr) {
                sp<BasicClient> clientSp = clientDescriptor->getValue();
                res = checkIfInjectionCameraIsPresent(mInjectionExternalCamId, clientSp);
                if(res != OK) {
                    return STATUS_ERROR_FMT(ERROR_DISCONNECTED,
                            "No camera device with ID \"%s\" currently available",
                            mInjectionExternalCamId.c_str());
                }
                res = clientSp->injectCamera(mInjectionExternalCamId, mCameraProviderManager);
                if (res != OK) {
                    mInjectionStatusListener->notifyInjectionError(mInjectionExternalCamId, res);
                }
            } else {
                ALOGE("%s: Internal camera ID = %s 's client does not exist!",
                        __FUNCTION__, mInjectionInternalCamId.c_str());
                res = NO_INIT;
                mInjectionStatusListener->notifyInjectionError(mInjectionExternalCamId, res);
            }
        }
    }

    return ret;
}
Status CameraService::makeClient(
        const sp<CameraService>& cameraService, const sp<IInterface>& cameraCb,
        const AttributionSourceState& clientAttribution, int callingPid, bool systemNativeClient,
        const std::string& cameraId, int api1CameraId, int facing, int sensorOrientation,
        int servicePid, std::pair<int, IPCTransport> deviceVersionAndTransport,
        apiLevel effectiveApiLevel, bool overrideForPerfClass, int rotationOverride,
        bool forceSlowJpegMode, const std::string& originalCameraId, bool sharedMode,
        bool isVendorClient,
        /*out*/sp<BasicClient>* client) {
    // For HIDL devices
    if (deviceVersionAndTransport.second == IPCTransport::HIDL) {
        // Create CameraClient based on device version reported by the HAL.
        int deviceVersion = deviceVersionAndTransport.first;
        switch(deviceVersion) {
            case CAMERA_DEVICE_API_VERSION_1_0:
                ALOGE("Camera using old HAL version: %d", deviceVersion);
                return STATUS_ERROR_FMT(ERROR_DEPRECATED_HAL,
                        "Camera device \"%s\" HAL version %d no longer supported",
                        cameraId.c_str(), deviceVersion);
                break;
            case CAMERA_DEVICE_API_VERSION_3_0:
            case CAMERA_DEVICE_API_VERSION_3_1:
            case CAMERA_DEVICE_API_VERSION_3_2:
            case CAMERA_DEVICE_API_VERSION_3_3:
            case CAMERA_DEVICE_API_VERSION_3_4:
            case CAMERA_DEVICE_API_VERSION_3_5:
            case CAMERA_DEVICE_API_VERSION_3_6:
            case CAMERA_DEVICE_API_VERSION_3_7:
                break;
            default:
                // Should not be reachable
                ALOGE("Unknown camera device HAL version: %d", deviceVersion);
                return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                        "Camera device \"%s\" has unknown HAL version %d",
                        cameraId.c_str(), deviceVersion);
        }
    }
    if (effectiveApiLevel == API_1) { // Camera1 API route
        sp<ICameraClient> tmp = static_cast<ICameraClient*>(cameraCb.get());
        *client = new Camera2Client(cameraService, tmp, cameraService->mCameraServiceProxyWrapper,
                                    cameraService->mAttributionAndPermissionUtils,
                                    clientAttribution, callingPid, cameraId, api1CameraId, facing,
                                    sensorOrientation, servicePid, overrideForPerfClass,
                                    rotationOverride, forceSlowJpegMode, /*sharedMode*/false);
        ALOGI("%s: Camera1 API (legacy), rotationOverride %d, forceSlowJpegMode %d",
                __FUNCTION__, rotationOverride, forceSlowJpegMode);
    } else { // Camera2 API route
        sp<hardware::camera2::ICameraDeviceCallbacks> tmp =
                static_cast<hardware::camera2::ICameraDeviceCallbacks*>(cameraCb.get());
        *client = new CameraDeviceClient(
                cameraService, tmp, cameraService->mCameraServiceProxyWrapper,
                cameraService->mAttributionAndPermissionUtils, clientAttribution, callingPid,
                systemNativeClient, cameraId, facing, sensorOrientation, servicePid,
                overrideForPerfClass, rotationOverride, originalCameraId, sharedMode,
                isVendorClient);
        ALOGI("%s: Camera2 API, rotationOverride %d", __FUNCTION__, rotationOverride);
    }
    return Status::ok();
}

frameworks/av/services/camera/libcameraservice/api2/CameraDeviceClient.cpp
The constructor and initialize function of CameraDeviceClient look like this:

CameraDeviceClient::CameraDeviceClient(
        const sp<CameraService>& cameraService,
        const sp<hardware::camera2::ICameraDeviceCallbacks>& remoteCallback,
        std::shared_ptr<CameraServiceProxyWrapper> cameraServiceProxyWrapper,
        std::shared_ptr<AttributionAndPermissionUtils> attributionAndPermissionUtils,
        const AttributionSourceState& clientAttribution, int callingPid, bool systemNativeClient,
        const std::string& cameraId, int cameraFacing, int sensorOrientation, int servicePid,
        bool overrideForPerfClass, int rotationOverride, const std::string& originalCameraId,
        bool sharedMode, bool isVendorClient)
    : Camera2ClientBase(cameraService, remoteCallback, cameraServiceProxyWrapper,
                        attributionAndPermissionUtils, clientAttribution, callingPid,
                        systemNativeClient, cameraId, /*API1 camera ID*/ -1, cameraFacing,
                        sensorOrientation, servicePid, overrideForPerfClass, rotationOverride,
                        sharedMode),
      mInputStream(),
      mStreamingRequestId(REQUEST_ID_NONE),
      mRequestIdCounter(0),
      mOverrideForPerfClass(overrideForPerfClass),
      mOriginalCameraId(originalCameraId),
      mIsVendorClient(isVendorClient) {
    ATRACE_CALL();
    ALOGI("CameraDeviceClient %s: Opened", cameraId.c_str());
}
status_t CameraDeviceClient::initialize(sp<CameraProviderManager> manager,
        const std::string& monitorTags) {
    return initializeImpl(manager, monitorTags);
}

template<typename TProviderPtr>
status_t CameraDeviceClient::initializeImpl(TProviderPtr providerPtr,
        const std::string& monitorTags) {
    ATRACE_CALL();
    status_t res;

    res = Camera2ClientBase::initialize(providerPtr, monitorTags); // 1
    if (res != OK) {
        return res;
    }

    mFrameProcessor = new FrameProcessorBase(mDevice); // frame data processor
    std::string threadName = std::string("CDU-") + mCameraIdStr + "-FrameProc";
    res = mFrameProcessor->run(threadName.c_str());
    if (res != OK) {
        ALOGE("%s: Unable to start frame processor thread: %s (%d)",
                __FUNCTION__, strerror(-res), res);
        return res;
    }

    mFrameProcessor->registerListener(camera2::FrameProcessorBase::FRAME_PROCESSOR_LISTENER_MIN_ID,
                                      camera2::FrameProcessorBase::FRAME_PROCESSOR_LISTENER_MAX_ID,
                                      /*listener*/this,
                                      /*sendPartials*/true);

    const CameraMetadata &deviceInfo = mDevice->info();
    camera_metadata_ro_entry_t physicalKeysEntry = deviceInfo.find(
            ANDROID_REQUEST_AVAILABLE_PHYSICAL_CAMERA_REQUEST_KEYS);
    if (physicalKeysEntry.count > 0) {
        mSupportedPhysicalRequestKeys.insert(mSupportedPhysicalRequestKeys.begin(),
                physicalKeysEntry.data.i32,
                physicalKeysEntry.data.i32 + physicalKeysEntry.count);
    }

    auto entry = deviceInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
    mDynamicProfileMap.emplace(
            ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD,
            ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD);
    if (entry.count > 0) {
        const auto it = std::find(entry.data.u8, entry.data.u8 + entry.count,
                ANDROID_REQUEST_AVAILABLE_CAPABILITIES_DYNAMIC_RANGE_TEN_BIT);
        if (it != entry.data.u8 + entry.count) {
            entry = deviceInfo.find(ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP);
            if (entry.count > 0 || ((entry.count % 3) != 0)) {
                int64_t standardBitmap =
                        ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD;
                for (size_t i = 0; i < entry.count; i += 3) {
                    if (entry.data.i64[i] !=
                            ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD) {
                        mDynamicProfileMap.emplace(entry.data.i64[i], entry.data.i64[i+1]);
                        if ((entry.data.i64[i+1] == 0) || (entry.data.i64[i+1] &
                                ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD)) {
                            standardBitmap |= entry.data.i64[i];
                        }
                    } else {
                        ALOGE("%s: Device %s includes unexpected profile entry: 0x%" PRIx64 "!",
                                __FUNCTION__, mCameraIdStr.c_str(), entry.data.i64[i]);
                    }
                }
                mDynamicProfileMap[ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD] =
                        standardBitmap;
            } else {
                ALOGE("%s: Device %s supports 10-bit output but doesn't include a dynamic range"
                        " profile map!", __FUNCTION__, mCameraIdStr.c_str());
            }
        }
    }

    mProviderManager = providerPtr;
    // Cache physical camera ids corresponding to this device and also the high
    // resolution sensors in this device + physical camera ids
    mProviderManager->isLogicalCamera(mCameraIdStr, &mPhysicalCameraIds);
    if (supportsUltraHighResolutionCapture(mCameraIdStr)) {
        mHighResolutionSensors.insert(mCameraIdStr);
    }
    for (auto &physicalId : mPhysicalCameraIds) {
        if (supportsUltraHighResolutionCapture(physicalId)) {
            mHighResolutionSensors.insert(physicalId);
        }
    }
    int32_t resultMQSize =
            property_get_int32("ro.vendor.camera.res.fmq.size", /*default*/METADATA_QUEUE_SIZE);
    res = CreateMetadataQueue(&mResultMetadataQueue, resultMQSize);
    if (res != OK) {
        ALOGE("%s: Creating result metadata queue failed: %s(%d)", __FUNCTION__,
            strerror(-res), res);
        return res;
    }
    return OK;
}

A CameraProviderManager object is the client side of the HIDL service; the corresponding server side is ICameraProvider, and the registered instances can be inspected with the lshal command.
frameworks/av/services/camera/libcameraservice/common/Camera2ClientBase.cpp
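To see this from the host side, you can query the device directly (a quick sketch; it requires adb and a connected, rooted-or-debuggable device, and exact instance names vary by vendor):

```shell
# lshal lists registered HIDL services; camera providers show up as
# android.hardware.camera.provider@X.Y::ICameraProvider/... entries.
adb shell lshal | grep -i camera.provider
```

If nothing matches, the device most likely ships an AIDL camera provider instead of a HIDL one, which lshal does not list.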
frameworks/av/services/camera/libcameraservice/common/Camera2ClientBase.cpp

template <typename TClientBase>
status_t Camera2ClientBase<TClientBase>::initialize(sp<CameraProviderManager> manager,
        const std::string& monitorTags) {
    return initializeImpl(manager, monitorTags);
}

template <typename TClientBase>
template <typename TProviderPtr>
status_t Camera2ClientBase<TClientBase>::initializeImpl(TProviderPtr providerPtr,
        const std::string& monitorTags) {
    ATRACE_CALL();
    ALOGV("%s: Initializing client for camera %s", __FUNCTION__,
          TClientBase::mCameraIdStr.c_str());
    status_t res;

    IPCTransport providerTransport = IPCTransport::INVALID;
    res = providerPtr->getCameraIdIPCTransport(TClientBase::mCameraIdStr,
            &providerTransport);
    if (res != OK) {
        return res;
    }
    switch (providerTransport) {
        case IPCTransport::HIDL:
            mDevice =
                    new HidlCamera3Device(mCameraServiceProxyWrapper,
                            TClientBase::mAttributionAndPermissionUtils,
                            TClientBase::mCameraIdStr, mOverrideForPerfClass,
                            TClientBase::mRotationOverride, mLegacyClient);
            break;
        case IPCTransport::AIDL:
            if (flags::camera_multi_client() && TClientBase::mSharedMode) {
                mDevice = AidlCamera3SharedDevice::getInstance(mCameraServiceProxyWrapper,
                            TClientBase::mAttributionAndPermissionUtils,
                            TClientBase::mCameraIdStr, mOverrideForPerfClass,
                            TClientBase::mRotationOverride, mLegacyClient);
            } else {
                mDevice =
                    new AidlCamera3Device(mCameraServiceProxyWrapper,
                            TClientBase::mAttributionAndPermissionUtils,
                            TClientBase::mCameraIdStr, mOverrideForPerfClass,
                            TClientBase::mRotationOverride, mLegacyClient);
            }
            break;
        default:
            ALOGE("%s Invalid transport for camera id %s", __FUNCTION__,
                    TClientBase::mCameraIdStr.c_str());
            return NO_INIT;
    }
    if (mDevice == NULL) {
        ALOGE("%s: Camera %s: No device connected",
                __FUNCTION__, TClientBase::mCameraIdStr.c_str());
        return NO_INIT;
    }

    // Notify camera opening (check op if check_full_attribution_source_chain flag is off).
    res = TClientBase::notifyCameraOpening();
    if (res != OK) {
        TClientBase::notifyCameraClosing();
        return res;
    }

    res = mDevice->initialize(providerPtr, monitorTags); // 1
    if (res != OK) {
        ALOGE("%s: Camera %s: unable to initialize device: %s (%d)",
                __FUNCTION__, TClientBase::mCameraIdStr.c_str(), strerror(-res), res);
        TClientBase::notifyCameraClosing();
        return res;
    }

    wp<NotificationListener> weakThis(this);
    res = mDevice->setNotifyCallback(weakThis);
    if (res != OK) {
        ALOGE("%s: Camera %s: Unable to set notify callback: %s (%d)",
                __FUNCTION__, TClientBase::mCameraIdStr.c_str(), strerror(-res), res);
        return res;
    }

    return OK;
}

In our case the camera HAL is reached over AIDL rather than HIDL, so mDevice is an AidlCamera3Device.
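You can verify this on a real device: AIDL HALs register with servicemanager rather than hwservicemanager, so they appear in `service list` (a device-dependent sketch; the instance name shown in the comment is the common default, not guaranteed on every device):

```shell
# AIDL camera providers are registered with servicemanager:
adb shell service list | grep -i camera.provider
# A typical AIDL camera provider instance name looks like:
#   android.hardware.camera.provider.ICameraProvider/internal/0
```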
frameworks/av/services/camera/libcameraservice/device3/aidl/AidlCamera3Device.cpp


AidlCamera3Device::AidlCamera3Device(
        std::shared_ptr<CameraServiceProxyWrapper>& cameraServiceProxyWrapper,
        std::shared_ptr<AttributionAndPermissionUtils> attributionAndPermissionUtils,
        const std::string& id, bool overrideForPerfClass, int rotationOverride,
        bool legacyClient) :
        Camera3Device(cameraServiceProxyWrapper, attributionAndPermissionUtils, id,
                overrideForPerfClass, rotationOverride, legacyClient) {
    mCallbacks = ndk::SharedRefBase::make<AidlCameraDeviceCallbacks>(this);
}

status_t AidlCamera3Device::initialize(sp<CameraProviderManager> manager,
        const std::string& monitorTags) {
    ATRACE_CALL();
    Mutex::Autolock il(mInterfaceLock);
    Mutex::Autolock l(mLock);

    ALOGV("%s: Initializing AIDL device for camera %s", __FUNCTION__, mId.c_str());
    if (mStatus != STATUS_UNINITIALIZED) {
        CLOGE("Already initialized!");
        return INVALID_OPERATION;
    }
    if (manager == nullptr) return INVALID_OPERATION;

    std::shared_ptr<camera::device::ICameraDeviceSession> session;
    ATRACE_BEGIN("CameraHal::openSession");
    status_t res = manager->openAidlSession(mId, mCallbacks,
            /*out*/ &session); // 1
    ATRACE_END();
    if (res != OK) {
        SET_ERR_L("Could not open camera session: %s (%d)", strerror(-res), res);
        return res;
    }
    if (session == nullptr) {
      SET_ERR("Session iface returned is null");
      return INVALID_OPERATION;
    }
    res = manager->getCameraCharacteristics(mId, mOverrideForPerfClass, &mDeviceInfo,
            mRotationOverride); // 2
    if (res != OK) {
        SET_ERR_L("Could not retrieve camera characteristics: %s (%d)", strerror(-res), res);
        session->close();
        return res;
    }
    mSupportNativeZoomRatio = manager->supportNativeZoomRatio(mId);
    mIsCompositeJpegRDisabled = manager->isCompositeJpegRDisabled(mId);

    std::vector<std::string> physicalCameraIds;
    bool isLogical = manager->isLogicalCamera(mId, &physicalCameraIds);
    if (isLogical) {
        for (auto& physicalId : physicalCameraIds) {
            // Do not override characteristics for physical cameras
            res = manager->getCameraCharacteristics(
                    physicalId, /*overrideForPerfClass*/false, &mPhysicalDeviceInfoMap[physicalId],
                    mRotationOverride);
            if (res != OK) {
                SET_ERR_L("Could not retrieve camera %s characteristics: %s (%d)",
                        physicalId.c_str(), strerror(-res), res);
                session->close();
                return res;
            }

            bool usePrecorrectArray =
                    DistortionMapper::isDistortionSupported(mPhysicalDeviceInfoMap[physicalId]);
            if (usePrecorrectArray) {
                res = mDistortionMappers[physicalId].setupStaticInfo(
                        mPhysicalDeviceInfoMap[physicalId]);
                if (res != OK) {
                    SET_ERR_L("Unable to read camera %s's calibration fields for distortion "
                            "correction", physicalId.c_str());
                    session->close();
                    return res;
                }
            }

            mZoomRatioMappers[physicalId] = ZoomRatioMapper(
                    &mPhysicalDeviceInfoMap[physicalId],
                    mSupportNativeZoomRatio, usePrecorrectArray);

            if (SessionConfigurationUtils::supportsUltraHighResolutionCapture(
                    mPhysicalDeviceInfoMap[physicalId])) {
                mUHRCropAndMeteringRegionMappers[physicalId] =
                        UHRCropAndMeteringRegionMapper(mPhysicalDeviceInfoMap[physicalId],
                                usePrecorrectArray);
            }
        }
    }

    std::shared_ptr<AidlRequestMetadataQueue> queue;
    ::aidl::android::hardware::common::fmq::MQDescriptor<
            int8_t, ::aidl::android::hardware::common::fmq::SynchronizedReadWrite> desc;

    ::ndk::ScopedAStatus requestQueueRet = session->getCaptureRequestMetadataQueue(&desc);
    if (!requestQueueRet.isOk()) {
        ALOGE("Transaction error when getting result metadata queue from camera session: %s",
                requestQueueRet.getMessage());
        return AidlProviderInfo::mapToStatusT(requestQueueRet);
    }
    queue = std::make_unique<AidlRequestMetadataQueue>(desc);
    if (!queue->isValid() || queue->availableToWrite() <= 0) {
        ALOGE("HAL returns empty result metadata fmq, not use it");
        queue = nullptr;
        // Don't use resQueue onwards.
    }

    std::unique_ptr<AidlResultMetadataQueue>& resQueue = mResultMetadataQueue;
    ::aidl::android::hardware::common::fmq::MQDescriptor<
        int8_t, ::aidl::android::hardware::common::fmq::SynchronizedReadWrite> resDesc;
    ::ndk::ScopedAStatus resultQueueRet = session->getCaptureResultMetadataQueue(&resDesc);
    if (!resultQueueRet.isOk()) {
        ALOGE("Transaction error when getting result metadata queue from camera session: %s",
                resultQueueRet.getMessage());
        return AidlProviderInfo::mapToStatusT(resultQueueRet);
    }
    resQueue = std::make_unique<AidlResultMetadataQueue>(resDesc);
    if (!resQueue->isValid() || resQueue->availableToWrite() <= 0) {
        ALOGE("HAL returns empty result metadata fmq, not use it");
        resQueue = nullptr;
        // Don't use resQueue onwards.
    }

    camera_metadata_entry bufMgrMode =
            mDeviceInfo.find(ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION);
    if (bufMgrMode.count > 0) {
        mUseHalBufManager = (bufMgrMode.data.u8[0] ==
                ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION_HIDL_DEVICE_3_5);
        mSessionHalBufManager = (bufMgrMode.data.u8[0] ==
                ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION_SESSION_CONFIGURABLE);
    }

    camera_metadata_entry_t capabilities = mDeviceInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
    for (size_t i = 0; i < capabilities.count; i++) {
        uint8_t capability = capabilities.data.u8[i];
        if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_OFFLINE_PROCESSING) {
            mSupportOfflineProcessing = true;
        }
    }

    mInterface =
            new AidlHalInterface(session, queue, mUseHalBufManager, mSupportOfflineProcessing,
                    mSessionHalBufManager);

    std::string providerType;
    mVendorTagId = manager->getProviderTagIdLocked(mId);
    mTagMonitor.initialize(mVendorTagId);
    if (!monitorTags.empty()) {
        mTagMonitor.parseTagsToMonitor(monitorTags);
    }

    for (size_t i = 0; i < capabilities.count; i++) {
        uint8_t capability = capabilities.data.u8[i];
        if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_MONOCHROME) {
            mNeedFixupMonochromeTags = true;
        }
    }

    // batch size limit is applied to the device with camera device version larger than 3.2 which is
    // AIDL v2
    hardware::hidl_version maxVersion{0, 0};
    IPCTransport transport = IPCTransport::AIDL;
    res = manager->getHighestSupportedVersion(mId, &maxVersion, &transport);
    if (res != OK) {
        ALOGE("%s: Error in getting camera device version id: %s (%d)", __FUNCTION__,
              strerror(-res), res);
        return res;
    }
    int deviceVersion = HARDWARE_DEVICE_API_VERSION(maxVersion.get_major(), maxVersion.get_minor());

    mBatchSizeLimitEnabled = (deviceVersion >= CAMERA_DEVICE_API_VERSION_1_2);

    camera_metadata_entry readoutSupported = mDeviceInfo.find(ANDROID_SENSOR_READOUT_TIMESTAMP);
    if (readoutSupported.count == 0) {
        ALOGW("%s: Could not find value corresponding to ANDROID_SENSOR_READOUT_TIMESTAMP. "
              "Assuming true.", __FUNCTION__);
        mSensorReadoutTimestampSupported = true;
    } else {
        mSensorReadoutTimestampSupported =
                readoutSupported.data.u8[0] == ANDROID_SENSOR_READOUT_TIMESTAMP_HARDWARE;
    }

    return initializeCommonLocked(manager); // 3
}

Marker 1: `manager->openAidlSession(mId, mCallbacks, /*out*/ &session);` is implemented in
frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp


status_t CameraProviderManager::openAidlSession(const std::string &id,
        const std::shared_ptr<
                aidl::android::hardware::camera::device::ICameraDeviceCallback>& callback,
        /*out*/
        std::shared_ptr<aidl::android::hardware::camera::device::ICameraDeviceSession> *session) {

    std::lock_guard<std::mutex> lock(mInterfaceMutex);

    auto deviceInfo = findDeviceInfoLocked(id);
    if (deviceInfo == nullptr) return NAME_NOT_FOUND;

    auto *aidlDeviceInfo3 = static_cast<AidlProviderInfo::AidlDeviceInfo3*>(deviceInfo);
    sp<ProviderInfo> parentProvider = deviceInfo->mParentProvider.promote();
    if (parentProvider == nullptr) {
        return DEAD_OBJECT;
    }
    auto provider =
            static_cast<AidlProviderInfo *>(parentProvider.get())->startProviderInterface();
    if (provider == nullptr) {
        return DEAD_OBJECT;
    }
    std::shared_ptr<HalCameraProvider> halCameraProvider =
            std::make_shared<AidlHalCameraProvider>(provider, provider->descriptor);
    saveRef(DeviceMode::CAMERA, id, halCameraProvider);

    auto interface = aidlDeviceInfo3->startDeviceInterface(); // 1
    if (interface == nullptr) {
        removeRef(DeviceMode::CAMERA, id);
        return DEAD_OBJECT;
    }

    auto ret = interface->open(callback, session); // 2
    if (!ret.isOk()) {
        removeRef(DeviceMode::CAMERA, id);
        ALOGE("%s: Transaction error opening a session for camera device %s: %s",
                __FUNCTION__, id.c_str(), ret.getMessage());
        return AidlProviderInfo::mapToStatusT(ret);
    }
    return OK;
}

Now look at the startDeviceInterface source:
frameworks/av/services/camera/libcameraservice/common/aidl/AidlProviderInfo.cpp


std::shared_ptr<aidl::android::hardware::camera::device::ICameraDevice>
AidlProviderInfo::AidlDeviceInfo3::startDeviceInterface() {
    Mutex::Autolock l(mDeviceAvailableLock);
    std::shared_ptr<camera::device::ICameraDevice> device;
    ATRACE_CALL();
    if (mSavedInterface == nullptr) {
        sp<AidlProviderInfo> parentProvider =
                static_cast<AidlProviderInfo *>(mParentProvider.promote().get());
        if (parentProvider != nullptr) {
            // Wait for lazy HALs to confirm device availability
            if (parentProvider->isExternalLazyHAL() && !mIsDeviceAvailable) {
                ALOGV("%s: Wait for external device to become available %s",
                      __FUNCTION__,
                      mId.c_str());

                auto res = mDeviceAvailableSignal.waitRelative(mDeviceAvailableLock,
                                                         kDeviceAvailableTimeout);
                if (res != OK) {
                    ALOGE("%s: Failed waiting for device to become available",
                          __FUNCTION__);
                    return nullptr;
                }
            }

            device = parentProvider->startDeviceInterface(mName); // 1
        }
    } else {
        device = mSavedInterface;
    }
    return device;
}

std::shared_ptr<camera::device::ICameraDevice>
AidlProviderInfo::startDeviceInterface(const std::string &name) {
    ::ndk::ScopedAStatus status;
    std::shared_ptr<camera::device::ICameraDevice> cameraInterface;
    const std::shared_ptr<ICameraProvider> interface = startProviderInterface(); // 1
    if (interface == nullptr) {
        return nullptr;
    }
    status = interface->getCameraDeviceInterface(name, &cameraInterface); // 2
    if (!status.isOk()) {
        ALOGE("%s: Transaction error trying to obtain interface for camera device %s: %s",
                __FUNCTION__, name.c_str(), status.getMessage());
        return nullptr;
    }
    return cameraInterface;
}

const std::shared_ptr<ICameraProvider> AidlProviderInfo::startProviderInterface() {
    ATRACE_CALL();
    ALOGV("Request to start camera provider: %s", mProviderName.c_str());
    if (mSavedInterface != nullptr) {
        return mSavedInterface;
    }

    if (!kEnableLazyHal) {
        ALOGE("Bad provider state! Should not be here on a non-lazy HAL!");
        return nullptr;
    }

    auto interface = mActiveInterface.lock();
    if (interface != nullptr) {
        ALOGV("Camera provider (%s) already in use. Re-using instance.", mProviderName.c_str());
        return interface;
    }

    // Try to get service without starting
    interface = ICameraProvider::fromBinder(
            ndk::SpAIBinder(AServiceManager_checkService(mProviderName.c_str())));
    if (interface != nullptr) {
        // Service is already running. Cache and return.
        mActiveInterface = interface;
        return interface;
    }

    ALOGV("Camera provider actually needs restart, calling getService(%s)", mProviderName.c_str());
    interface = mManager->mAidlServiceProxy->getService(mProviderName); // 1

    if (interface == nullptr) {
        ALOGE("%s: %s service not started", __FUNCTION__, mProviderName.c_str());
        return nullptr;
    }

    // Set all devices as ENUMERATING, provider should update status
    // to PRESENT after initializing.
    // This avoids failing getCameraDeviceInterface_V3_x before devices
    // are ready.
    for (auto& device : mDevices) {
      device->mIsDeviceAvailable = false;
    }

    interface->setCallback(mCallbacks);
    auto link = AIBinder_linkToDeath(interface->asBinder().get(), mDeathRecipient.get(),
            this);
    if (link != STATUS_OK) {
        ALOGW("%s: Unable to link to provider '%s' death notifications",
                __FUNCTION__, mProviderName.c_str());
        mManager->removeProvider(std::string(mProviderInstance));
        return nullptr;
    }

    // Send current device state
    interface->notifyDeviceStateChange(mDeviceState);
    // Cache interface to return early for future calls.
    mActiveInterface = interface;

    return interface;
}

Every class that derives from RefBase and is managed by smart pointers gets an onFirstRef callback the first time it is strongly referenced.
frameworks/av/services/camera/libcameraservice/CameraService.cpp

status_t CameraService::enumerateProviders() {
    status_t res;

    std::vector<std::string> deviceIds;
    std::unordered_map<std::string, std::set<std::string>> unavailPhysicalIds;
    {
        Mutex::Autolock l(mServiceLock);

        if (nullptr == mCameraProviderManager.get()) {
            mCameraProviderManager = new CameraProviderManager();
            res = mCameraProviderManager->initialize(this); // 1
            if (res != OK) {
                ALOGE("%s: Unable to initialize camera provider manager: %s (%d)",
                        __FUNCTION__, strerror(-res), res);
                logServiceError("Unable to initialize camera provider manager",
                        ERROR_DISCONNECTED);
                return res;
            }
        }

        // Setup vendor tags before we call get_camera_info the first time
        // because HAL might need to setup static vendor keys in get_camera_info
        // TODO: maybe put this into CameraProviderManager::initialize()?
        mCameraProviderManager->setUpVendorTags();

        if (nullptr == mFlashlight.get()) {
            mFlashlight = new CameraFlashlight(mCameraProviderManager, this);
        }

        res = mFlashlight->findFlashUnits();
        if (res != OK) {
            ALOGE("Failed to enumerate flash units: %s (%d)", strerror(-res), res);
        }

        deviceIds = mCameraProviderManager->getCameraDeviceIds(&unavailPhysicalIds);
    }

    for (auto& cameraId : deviceIds) {
        if (getCameraState(cameraId) == nullptr) {
            onDeviceStatusChanged(cameraId, CameraDeviceStatus::PRESENT);
        }
        if (unavailPhysicalIds.count(cameraId) > 0) {
            for (const auto& physicalId : unavailPhysicalIds[cameraId]) {
                onDeviceStatusChanged(cameraId, physicalId, CameraDeviceStatus::NOT_PRESENT);
            }
        }
    }

    // Derive primary rear/front cameras, and filter their charactierstics.
    // This needs to be done after all cameras are enumerated and camera ids are sorted.
    if (SessionConfigurationUtils::IS_PERF_CLASS) {
        // Assume internal cameras are advertised from the same
        // provider. If multiple providers are registered at different time,
        // and each provider contains multiple internal color cameras, the current
        // logic may filter the characteristics of more than one front/rear color
        // cameras.
        Mutex::Autolock l(mServiceLock);
        filterSPerfClassCharacteristicsLocked();
    }

    return OK;
}

mCameraProviderManager->initialize(this); relies on default arguments: they are declared in the header, and the default proxy objects they point at are defined in CameraProviderManager.cpp.
frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.h

class CameraProviderManager : virtual public hidl::manager::V1_0::IServiceNotification,
        public virtual IServiceManager::LocalRegistrationCallback {
public:
    // needs to be made friend strict since HidlProviderInfo needs to inherit
    // from CameraProviderManager::ProviderInfo which isn't a public member.
    friend struct HidlProviderInfo;
    friend struct AidlProviderInfo;
    ~CameraProviderManager();

    // Tiny proxy for the static methods in a HIDL interface that communicate with the hardware
    // service manager, to be replacable in unit tests with a fake.
    struct HidlServiceInteractionProxy {
        virtual bool registerForNotifications(
                const std::string &serviceName,
                const sp<hidl::manager::V1_0::IServiceNotification>
                &notification) = 0;
        // Will not wait for service to start if it's not already running
        virtual sp<hardware::camera::provider::V2_4::ICameraProvider> tryGetService(
                const std::string &serviceName) = 0;
        // Will block for service if it exists but isn't running
        virtual sp<hardware::camera::provider::V2_4::ICameraProvider> getService(
                const std::string &serviceName) = 0;
        virtual hardware::hidl_vec<hardware::hidl_string> listServices() = 0;
        virtual ~HidlServiceInteractionProxy() {}
    };

    // Standard use case - call into the normal generated static methods which invoke
    // the real hardware service manager
    struct HidlServiceInteractionProxyImpl : public HidlServiceInteractionProxy {
        virtual bool registerForNotifications(
                const std::string &serviceName,
                const sp<hidl::manager::V1_0::IServiceNotification>
                &notification) override {
            return hardware::camera::provider::V2_4::ICameraProvider::registerForNotifications(
                    serviceName, notification);
        }
        virtual sp<hardware::camera::provider::V2_4::ICameraProvider> tryGetService(
                const std::string &serviceName) override {
            return hardware::camera::provider::V2_4::ICameraProvider::tryGetService(serviceName);
        }
        virtual sp<hardware::camera::provider::V2_4::ICameraProvider> getService(
                const std::string &serviceName) override {
            return hardware::camera::provider::V2_4::ICameraProvider::getService(serviceName);
        }

        virtual hardware::hidl_vec<hardware::hidl_string> listServices() override;
    };

    // Proxy to inject fake services in test.
    class AidlServiceInteractionProxy {
      public:
        // Returns the Aidl service with the given serviceName. Will wait indefinitely
        // for the service to come up if not running.
        virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
        getService(const std::string& serviceName) = 0;

        // Attempts to get an already running AIDL service of the given serviceName.
        // Returns nullptr immediately if service is not running.
        virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
        tryGetService(const std::string& serviceName) = 0;

        virtual ~AidlServiceInteractionProxy() = default;
    };

    // Standard use case - call into the normal static methods which invoke
    // the real service manager
    class AidlServiceInteractionProxyImpl : public AidlServiceInteractionProxy {
      public:
        virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
        getService(const std::string& serviceName) override;

        virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
        tryGetService(const std::string& serviceName) override;
    };

    /**
     * Listener interface for device/torch status changes
     */
    struct StatusListener : virtual public RefBase {
        ~StatusListener() {}

        virtual void onDeviceStatusChanged(const std::string &cameraId,
                CameraDeviceStatus newStatus) = 0;
        virtual void onDeviceStatusChanged(const std::string &cameraId,
                const std::string &physicalCameraId,
                CameraDeviceStatus newStatus) = 0;
        virtual void onTorchStatusChanged(const std::string &cameraId,
                TorchModeStatus newStatus,
                SystemCameraKind kind) = 0;
        virtual void onTorchStatusChanged(const std::string &cameraId,
                TorchModeStatus newStatus) = 0;
        virtual void onNewProviderRegistered() = 0;
    };

    /**
     * Represents the mode a camera device is currently in
     */
    enum class DeviceMode {
        TORCH,
        CAMERA
    };

    /**
     * Initialize the manager and give it a status listener; optionally accepts a service
     * interaction proxy.
     *
     * The default proxy communicates via the hardware service manager; alternate proxies can be
     * used for testing. The lifetime of the proxy must exceed the lifetime of the manager.
     */
    status_t initialize(wp<StatusListener> listener,
                        HidlServiceInteractionProxy* hidlProxy = &sHidlServiceInteractionProxy,
                        AidlServiceInteractionProxy* aidlProxy = &sAidlServiceInteractionProxy);
	// ...
}

实现在frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp

CameraProviderManager::HidlServiceInteractionProxyImpl
CameraProviderManager::sHidlServiceInteractionProxy{};
CameraProviderManager::AidlServiceInteractionProxyImpl
CameraProviderManager::sAidlServiceInteractionProxy{};
std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
CameraProviderManager::AidlServiceInteractionProxyImpl::getService(
        const std::string& serviceName) {
    using aidl::android::hardware::camera::provider::ICameraProvider;

    AIBinder* binder = nullptr;
    binder = AServiceManager_waitForService(serviceName.c_str()); // "android.hardware.camera.provider.ICameraProvider"

    if (binder == nullptr) {
        ALOGE("%s: AIDL Camera provider HAL '%s' is not actually available, despite waiting "
              "indefinitely?", __FUNCTION__, serviceName.c_str());
        return nullptr;
    }
    std::shared_ptr<ICameraProvider> interface =
            ICameraProvider::fromBinder(ndk::SpAIBinder(binder));

    return interface;
}

status_t CameraProviderManager::tryToAddAidlProvidersLocked() {
    const char * aidlHalServiceDescriptor =
            aidl::android::hardware::camera::provider::ICameraProvider::descriptor;
    auto sm = defaultServiceManager();
    auto aidlProviders = sm->getDeclaredInstances(
            String16(aidlHalServiceDescriptor));

    if (isVirtualCameraHalEnabled()) {
        // Virtual Camera provider is not declared in the VINTF manifest so we
        // manually add it if the binary is present.
        aidlProviders.push_back(String16(kVirtualProviderName.c_str()));
    }

    for (const auto &aidlInstance : aidlProviders) {
        std::string aidlServiceName =
                getFullAidlProviderName(toStdString(aidlInstance));
        auto res = sm->registerForNotifications(String16(aidlServiceName.c_str()), this);
        if (res != OK) {
            ALOGE("%s Unable to register for notifications with AIDL service manager",
                    __FUNCTION__);
            return res;
        }
        addAidlProviderLocked(aidlServiceName);
    }
    return OK;
}

startProviderInterface -> getCameraDeviceInterface:
The framework locates the android.hardware.camera.provider.ICameraProvider service, then calls its getCameraDeviceInterface method.
This is again IPC: ICameraProvider.aidl generates BpCameraProvider (the client-side proxy) and BnCameraProvider (the server-side stub).
So we only need to find who implements BnCameraProvider — it is AidlCameraProvider, which implements getCameraDeviceInterface.
hardware/google/camera/common/hal/aidl_service/aidl_camera_provider.cc


ScopedAStatus AidlCameraProvider::getCameraDeviceInterface(
    const std::string& camera_device_name,
    std::shared_ptr<ICameraDevice>* device) {
  std::unique_ptr<CameraDevice> google_camera_device;
  if (device == nullptr) {
    ALOGE("%s: device is nullptr. ", __FUNCTION__);
    return ScopedAStatus::fromServiceSpecificError(
        static_cast<int32_t>(Status::ILLEGAL_ARGUMENT));
  }

  // Parse camera_device_name.
  std::string camera_id, device_version;

  bool match = ParseDeviceName(camera_device_name, &device_version, &camera_id);
  if (!match) {
    ALOGE("%s: Device name parse fail. ", __FUNCTION__);
    return ScopedAStatus::fromServiceSpecificError(
        static_cast<int32_t>(Status::ILLEGAL_ARGUMENT));
  }

  int camera_id_int = atoi(camera_id.c_str());
  status_t res = google_camera_provider_->CreateCameraDevice(
      camera_id_int, &google_camera_device);
  if (res != OK) {
    ALOGE("%s: Creating CameraDevice failed: %s(%d)", __FUNCTION__,
          strerror(-res), res);
    return aidl_utils::ConvertToAidlReturn(res);
  }

  *device = device::implementation::AidlCameraDevice::Create(
      std::move(google_camera_device)); // 1
  if (*device == nullptr) {
    ALOGE("%s: Creating AidlCameraDevice failed", __FUNCTION__);
    return ScopedAStatus::fromServiceSpecificError(
        static_cast<int32_t>(Status::INTERNAL_ERROR));
  }

#ifdef __ANDROID_APEX__
  available_camera_ids_.erase(camera_id_int);
  if (!camera_device_initialized_ && available_camera_ids_.empty()) {
    camera_device_initialized_ = true;

    std::string ready_property_name = "vendor.camera.hal.ready.count";
    int ready_count = property_get_int32(ready_property_name.c_str(), 0);
    property_set(ready_property_name.c_str(),
                 std::to_string(++ready_count).c_str());
    ALOGI(
        "AidlCameraProvider::getCameraDeviceInterface() first time ready "
        "count: %d ",
        ready_count);
  }
#endif
  return ScopedAStatus::ok();
}

frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp

status_t CameraProviderManager::openAidlSession(const std::string &id,
        const std::shared_ptr<
                aidl::android::hardware::camera::device::ICameraDeviceCallback>& callback,
        /*out*/
        std::shared_ptr<aidl::android::hardware::camera::device::ICameraDeviceSession> *session) {

    std::lock_guard<std::mutex> lock(mInterfaceMutex);

    auto deviceInfo = findDeviceInfoLocked(id);
    if (deviceInfo == nullptr) return NAME_NOT_FOUND;

    auto *aidlDeviceInfo3 = static_cast<AidlProviderInfo::AidlDeviceInfo3*>(deviceInfo);
    sp<ProviderInfo> parentProvider = deviceInfo->mParentProvider.promote();
    if (parentProvider == nullptr) {
        return DEAD_OBJECT;
    }
    auto provider =
            static_cast<AidlProviderInfo *>(parentProvider.get())->startProviderInterface();
    if (provider == nullptr) {
        return DEAD_OBJECT;
    }
    std::shared_ptr<HalCameraProvider> halCameraProvider =
            std::make_shared<AidlHalCameraProvider>(provider, provider->descriptor);
    saveRef(DeviceMode::CAMERA, id, halCameraProvider);

    auto interface = aidlDeviceInfo3->startDeviceInterface();
    if (interface == nullptr) {
        removeRef(DeviceMode::CAMERA, id);
        return DEAD_OBJECT;
    }

    auto ret = interface->open(callback, session); // 1
    if (!ret.isOk()) {
        removeRef(DeviceMode::CAMERA, id);
        ALOGE("%s: Transaction error opening a session for camera device %s: %s",
                __FUNCTION__, id.c_str(), ret.getMessage());
        return AidlProviderInfo::mapToStatusT(ret);
    }
    return OK;
}

To sum up, the `interface` used at marker 1 is the AidlCameraDevice created by AidlCameraDevice::Create, so the call already goes straight to the HAL over AIDL.
hardware/google/camera/common/hal/aidl_service/aidl_camera_device.cc


std::shared_ptr<AidlCameraDevice> AidlCameraDevice::Create(
    std::unique_ptr<CameraDevice> google_camera_device) {
  auto device = ndk::SharedRefBase::make<AidlCameraDevice>();
  if (device == nullptr) {
    ALOGE("%s: Cannot create a AidlCameraDevice.", __FUNCTION__);
    return nullptr;
  }

  status_t res = device->Initialize(std::move(google_camera_device));
  if (res != OK) {
    ALOGE("%s: Initializing AidlCameraDevice failed: %s(%d)", __FUNCTION__,
          strerror(-res), res);
    return nullptr;
  }

  return device;
}
status_t AidlCameraDevice::Initialize(
    std::unique_ptr<CameraDevice> google_camera_device) {
  if (google_camera_device == nullptr) {
    ALOGE("%s: google_camera_device is nullptr.", __FUNCTION__);
    return BAD_VALUE;
  }

  camera_id_ = google_camera_device->GetPublicCameraId();
  google_camera_device_ = std::move(google_camera_device);
  aidl_profiler_ = google_camera_hal::AidlProfiler::Create(camera_id_);
  if (aidl_profiler_ == nullptr) {
    ALOGE("%s: Failed to create AidlProfiler.", __FUNCTION__);
    return UNKNOWN_ERROR;
  }
  return OK;
}

Next, look at AidlCameraDevice's open method.
hardware/google/camera/common/hal/aidl_service/aidl_camera_device.cc

ScopedAStatus AidlCameraDevice::open(
    const std::shared_ptr<ICameraDeviceCallback>& callback,
    std::shared_ptr<ICameraDeviceSession>* session_ret) {
  if (session_ret == nullptr) {
    return ScopedAStatus::fromServiceSpecificError(
        static_cast<int32_t>(Status::ILLEGAL_ARGUMENT));
  }
  *session_ret = nullptr;
  auto profiler = aidl_profiler_->MakeScopedProfiler(
      google_camera_hal::EventType::kOpen,
      google_camera_device_->GetProfiler(camera_id_,
                                         aidl_profiler_->GetLatencyFlag()),
      google_camera_device_->GetProfiler(camera_id_,
                                         aidl_profiler_->GetFpsFlag()));

  std::unique_ptr<google_camera_hal::CameraDeviceSession> session;
  status_t res = google_camera_device_->CreateCameraDeviceSession(&session);
  if (res != OK || session == nullptr) {
    ALOGE("%s: Creating CameraDeviceSession failed: %s(%d)", __FUNCTION__,
          strerror(-res), res);
    return aidl_utils::ConvertToAidlReturn(res);
  }

  auto aidl_session = AidlCameraDeviceSession::Create(
      callback, std::move(session), aidl_profiler_);
  if (aidl_session == nullptr) {
    ALOGE("%s: Creating AidlCameraDeviceSession failed.", __FUNCTION__);
    return aidl_utils::ConvertToAidlReturn(res);
  }
  *session_ret = aidl_session;
  return ScopedAStatus::ok();
}
HidlProviderInfo::getIPCTransport() returns IPCTransport::HIDL.
AidlProviderInfo::getIPCTransport() returns IPCTransport::AIDL.

Whether the AIDL or the HIDL provider is used is determined by how the camera HAL is implemented (the vendor chooses AIDL- or HIDL-based interfaces when building the HAL). The framework adapts to either implementation automatically, so upper layers never have to care about the difference.
That said, AIDL providers are the recommended path going forward.
frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp

status_t Camera3Device::initializeCommonLocked(sp<CameraProviderManager> manager) {

    /** Start up status tracker thread */
    mStatusTracker = new StatusTracker(this);
    status_t res = mStatusTracker->run((std::string("C3Dev-") + mId + "-Status").c_str());
    if (res != OK) {
        SET_ERR_L("Unable to start status tracking thread: %s (%d)",
                strerror(-res), res);
        mInterface->close();
        mStatusTracker.clear();
        return res;
    }

    /** Register in-flight map to the status tracker */
    mInFlightStatusId = mStatusTracker->addComponent("InflightRequests");

    /** Create buffer manager */
    mBufferManager = new Camera3BufferManager();

    Vector<int32_t> sessionParamKeys;
    camera_metadata_entry_t sessionKeysEntry = mDeviceInfo.find(
            ANDROID_REQUEST_AVAILABLE_SESSION_KEYS);
    if (sessionKeysEntry.count > 0) {
        sessionParamKeys.insertArrayAt(sessionKeysEntry.data.i32, 0, sessionKeysEntry.count);
    }

    camera_metadata_entry_t availableTestPatternModes = mDeviceInfo.find(
            ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES);
    for (size_t i = 0; i < availableTestPatternModes.count; i++) {
        if (availableTestPatternModes.data.i32[i] ==
                ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR) {
            mSupportCameraMute = true;
            mSupportTestPatternSolidColor = true;
            break;
        } else if (availableTestPatternModes.data.i32[i] ==
                ANDROID_SENSOR_TEST_PATTERN_MODE_BLACK) {
            mSupportCameraMute = true;
            mSupportTestPatternSolidColor = false;
        }
    }

    camera_metadata_entry_t availableSettingsOverrides = mDeviceInfo.find(
            ANDROID_CONTROL_AVAILABLE_SETTINGS_OVERRIDES);
    for (size_t i = 0; i < availableSettingsOverrides.count; i++) {
        if (availableSettingsOverrides.data.i32[i] ==
                ANDROID_CONTROL_SETTINGS_OVERRIDE_ZOOM) {
            mSupportZoomOverride = true;
            break;
        }
    }

    /** Start up request queue thread */
    mRequestThread = createNewRequestThread(
            this, mStatusTracker, mInterface, sessionParamKeys,
            mUseHalBufManager, mSupportCameraMute, mRotationOverride,
            mSupportZoomOverride);
    res = mRequestThread->run((std::string("C3Dev-") + mId + "-ReqQueue").c_str());
    if (res != OK) {
        SET_ERR_L("Unable to start request queue thread: %s (%d)",
                strerror(-res), res);
        mInterface->close();
        mRequestThread.clear();
        return res;
    }

    setCameraMuteLocked(mCameraMuteInitial);

    mPreparerThread = new PreparerThread();

    internalUpdateStatusLocked(STATUS_UNCONFIGURED);
    mNextStreamId = 0;
    mFakeStreamId = NO_STREAM;
    mNeedConfig = true;
    mPauseStateNotify = false;
    mIsInputStreamMultiResolution = false;

    // Measure the clock domain offset between camera and video/hw_composer
    mTimestampOffset = getMonoToBoottimeOffset();
    camera_metadata_entry timestampSource =
            mDeviceInfo.find(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE);
    if (timestampSource.count > 0 && timestampSource.data.u8[0] ==
            ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME) {
        mDeviceTimeBaseIsRealtime = true;
    }

    // Will the HAL be sending in early partial result metadata?
    camera_metadata_entry partialResultsCount =
            mDeviceInfo.find(ANDROID_REQUEST_PARTIAL_RESULT_COUNT);
    if (partialResultsCount.count > 0) {
        mNumPartialResults = partialResultsCount.data.i32[0];
        mUsePartialResult = (mNumPartialResults > 1);
    }

    bool usePrecorrectArray = DistortionMapper::isDistortionSupported(mDeviceInfo);
    if (usePrecorrectArray) {
        res = mDistortionMappers[mId].setupStaticInfo(mDeviceInfo);
        if (res != OK) {
            SET_ERR_L("Unable to read necessary calibration fields for distortion correction");
            return res;
        }
    }

    mZoomRatioMappers[mId] = ZoomRatioMapper(&mDeviceInfo,
            mSupportNativeZoomRatio, usePrecorrectArray);

    if (SessionConfigurationUtils::supportsUltraHighResolutionCapture(mDeviceInfo)) {
        mUHRCropAndMeteringRegionMappers[mId] =
                UHRCropAndMeteringRegionMapper(mDeviceInfo, usePrecorrectArray);
    }

    if (RotateAndCropMapper::isNeeded(&mDeviceInfo)) {
        mRotateAndCropMappers.emplace(mId, &mDeviceInfo);
    }

    // Hidl/AidlCamera3DeviceInjectionMethods
    mInjectionMethods = createCamera3DeviceInjectionMethods(this);

    /** Start watchdog thread */
    mCameraServiceWatchdog = new CameraServiceWatchdog(
            manager->getProviderPids(), mId, mCameraServiceProxyWrapper);
    res = mCameraServiceWatchdog->run("CameraServiceWatchdog");
    if (res != OK) {
        SET_ERR_L("Unable to start camera service watchdog thread: %s (%d)",
                strerror(-res), res);
        return res;
    }

    mSupportsExtensionKeys = areExtensionKeysSupported(mDeviceInfo);

    return OK;
}

CameraManager source code walkthrough

Sample code

package edu.tyut.ffmpeglearn.manager

import android.Manifest
import android.content.Context
import android.graphics.Camera
import android.graphics.ImageFormat
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CaptureFailure
import android.hardware.camera2.CaptureRequest
import android.hardware.camera2.params.OutputConfiguration
import android.hardware.camera2.params.SessionConfiguration
import android.hardware.camera2.params.StreamConfigurationMap
import android.media.Image
import android.media.ImageReader
import android.net.Uri
import android.os.Build
import android.os.Environment
import android.os.Handler
import android.os.HandlerThread
import android.util.Log
import android.util.Range
import android.util.Size
import android.view.Surface
import androidx.annotation.RequiresPermission
import androidx.core.content.FileProvider
import java.io.File
import java.util.concurrent.Executor

private const val TAG: String = "CaptureManager"

/**
 * ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 30 yuv420p.yuv
 */
internal class CaptureManager internal constructor(
    private val context: Context,
) {
    private var lastTimestamp = 0L
    private var frameCount = 0
    private val cameraManager: CameraManager by lazy {
        context.getSystemService(CameraManager::class.java)
    }

    private val cameraThread = HandlerThread("CameraThread").apply { start() }
    private val cameraHandler = Handler(cameraThread.looper)
    private val executor: Executor = Executor { runnable ->
        cameraHandler.post(runnable)
    }

    private var mCaptureSession: CameraCaptureSession? = null
    private var mCameraDevice: CameraDevice? = null
    private var mImageReader: ImageReader? = null

    private val yuv420pUri: Uri = FileProvider.getUriForFile(
        context, "${context.packageName}.provider", File(
            Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS),
            "yuv420p.yuv"
        )
    )

    private val outputStream = context.contentResolver.openOutputStream(yuv420pUri)


    @RequiresPermission(Manifest.permission.CAMERA)
    // @RequiresApi(Build.VERSION_CODES.VANILLA_ICE_CREAM)
    internal fun open() {
        // val cameraId: String = cameraManager.cameraIdList.firstOrNull() ?: "0"
        val cameraId: String = "0"
        val cameraCharacteristics: CameraCharacteristics =
            cameraManager.getCameraCharacteristics(cameraId)
        val streamConfigurationMap: StreamConfigurationMap? =
            cameraCharacteristics[CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP]
        // LEGACY level means the device emulates the camera2 API on top of the legacy camera HAL
        val isSupportLegacy: Boolean =
            cameraCharacteristics[CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL] == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY
        val fpsRanges: Array<Range<Int>>? =
            cameraCharacteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES)
        fpsRanges?.forEach {
            Log.i(TAG, "Supported FPS range: ${it.lower} - ${it.upper}")
        }
        val hardwareLevel = cameraCharacteristics.get(
            CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL
        )
        val outputSizes: Array<Size>? =
            streamConfigurationMap?.getOutputSizes(ImageFormat.YUV_420_888)
        Log.i(
            TAG,
            "open -> isSupportLegacy: $isSupportLegacy, hardwareLevel: $hardwareLevel, outputSizes: ${outputSizes?.joinToString()}"
        )
        streamConfigurationMap?.getHighSpeedVideoSizes()?.forEach {
            val ranges = streamConfigurationMap.getHighSpeedVideoFpsRangesFor(it)
            Log.i(TAG, "open -> size: $it, fps: ${ranges.joinToString()}")
        }
        val pixelsSize: Size =
            outputSizes?.firstOrNull { it.width == 1280 && it.height == 720 } ?: Size(
                1280,
                720
            ) // 1440 1080
        Log.i(TAG, "open -> pixelsSize: $pixelsSize")
        val imageReader =
            ImageReader.newInstance(pixelsSize.width, pixelsSize.height, ImageFormat.YUV_420_888, 3)
        this.mImageReader = imageReader

        imageReader.setOnImageAvailableListener({ imageReader: ImageReader? ->
            imageReader?.acquireLatestImage()?.use { image ->
                Log.i(
                    TAG,
                    "open -> Available image width: ${image.width}, height: ${image.height}"
                )
                val byteArray: ByteArray = this@CaptureManager.yuv420ToYuv420p(image)
                outputStream?.write(byteArray)
                outputStream?.flush()
            }
        }, cameraHandler)

        // imageReader.setOnImageAvailableListener({ reader ->
        //     val currentTimestamp = System.currentTimeMillis()
        //     frameCount++
        //     if (lastTimestamp == 0L) {
        //         lastTimestamp = currentTimestamp
        //     } else {
        //         val diff = currentTimestamp - lastTimestamp
        //         if (diff >= 1000) {
        //             val actualFps = frameCount * 1000 / diff
        //             Log.i(TAG, "Actual capture FPS: $actualFps")
        //             frameCount = 0
        //             lastTimestamp = currentTimestamp
        //         }
        //     }
        //     val image = reader.acquireLatestImage()
        //     image?.close()
        // }, cameraHandler)
        
        cameraManager.openCamera(cameraId, object : CameraDevice.StateCallback() {
            override fun onDisconnected(camera: CameraDevice) {
                Log.i(TAG, "onDisconnected...")
            }

            override fun onError(camera: CameraDevice, error: Int) {
                Log.i(TAG, "onError -> error: $error")
            }

            override fun onOpened(camera: CameraDevice) {
                this@CaptureManager.mCameraDevice = camera

                val captureRequestBuilder: CaptureRequest.Builder =
                    camera.createCaptureRequest(if (isSupportLegacy) CameraDevice.TEMPLATE_RECORD else CameraDevice.TEMPLATE_PREVIEW)
                captureRequestBuilder.addTarget(imageReader.surface)
                fpsRanges?.firstOrNull { it.lower == 30 && it.upper == 30 }
                    ?.let { fpsRange: Range<Int> ->
                        Log.i(TAG, "onOpened -> fpsRange: $fpsRange")
                        captureRequestBuilder.set<Range<Int>>(
                            CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE,
                            fpsRange // lock capture to a fixed 30 fps
                        )
                    }
                captureRequestBuilder.set(
                    CaptureRequest.CONTROL_AE_MODE,
                    CaptureRequest.CONTROL_AE_MODE_ON
                )
                createSession(
                    camera = camera,
                    imageReader = imageReader,
                    captureRequestBuilder = captureRequestBuilder
                )
            }

            override fun onClosed(camera: CameraDevice) {
                super.onClosed(camera)
                Log.i(TAG, "onClosed...")
                // cameraHandler.post {
                //     cameraHandler.removeCallbacksAndMessages(null)
                //     val quitSafely: Boolean = cameraThread.quitSafely()
                //     Log.i(TAG, "release -> quitSafely: $quitSafely")
                // }
            }
        }, cameraHandler)
    }

    // @RequiresApi(Build.VERSION_CODES.VANILLA_ICE_CREAM)
    private fun createSession(
        camera: CameraDevice,
        imageReader: ImageReader,
        captureRequestBuilder: CaptureRequest.Builder
    ) {
        val outputConfiguration = OutputConfiguration(imageReader.surface)
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
            val sessionConfiguration = SessionConfiguration(
                SessionConfiguration.SESSION_REGULAR,
                listOf<OutputConfiguration>(outputConfiguration),
                executor,
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigureFailed(session: CameraCaptureSession) {
                        Log.i(TAG, "onConfigureFailed...")
                    }

                    override fun onConfigured(session: CameraCaptureSession) {
                        this@CaptureManager.mCaptureSession = session
                        session.setRepeatingRequest(
                            captureRequestBuilder.build(),
                            object : CameraCaptureSession.CaptureCallback() {
                                override fun onCaptureFailed(
                                    session: CameraCaptureSession,
                                    request: CaptureRequest,
                                    failure: CaptureFailure
                                ) {
                                    Log.i(TAG, "onCaptureFailed -> failure...")
                                }
                            },
                            cameraHandler
                        )
                    }
                })
            camera.createCaptureSession(sessionConfiguration)
        } else {
            @Suppress("DEPRECATION")
            camera.createCaptureSession(
                listOf<Surface>(imageReader.surface),
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigureFailed(session: CameraCaptureSession) {
                        Log.i(TAG, "onConfigureFailed...")
                    }

                    override fun onConfigured(session: CameraCaptureSession) {
                        this@CaptureManager.mCaptureSession = session
                        session.setRepeatingRequest(
                            captureRequestBuilder.build(),
                            object : CameraCaptureSession.CaptureCallback() {
                                override fun onCaptureFailed(
                                    session: CameraCaptureSession,
                                    request: CaptureRequest,
                                    failure: CaptureFailure
                                ) {
                                    Log.i(TAG, "onCaptureFailed -> failure...")
                                }
                            },
                            cameraHandler
                        )
                    }
                },
                cameraHandler
            )
        }
    }


    fun yuv420ToNv21(image: Image): ByteArray {
        val width = image.width
        val height = image.height
        val ySize = width * height
        val uvSize = width * height / 2
        val out = ByteArray(ySize + uvSize)

        val yPlane = image.planes[0]
        val uPlane = image.planes[1]
        val vPlane = image.planes[2]

        // Copy the Y plane row by row (rowStride may be larger than width)
        var pos = 0
        for (row in 0 until height) {
            yPlane.buffer.position(row * yPlane.rowStride)
            yPlane.buffer.get(out, pos, width)
            pos += width
        }

        // Interleave V/U in NV21 order, honoring rowStride and pixelStride
        for (row in 0 until height / 2) {
            for (col in 0 until width / 2) {
                val uIndex = row * uPlane.rowStride + col * uPlane.pixelStride
                val vIndex = row * vPlane.rowStride + col * vPlane.pixelStride
                out[pos++] = vPlane.buffer.get(vIndex) // V
                out[pos++] = uPlane.buffer.get(uIndex) // U
            }
        }

        return out
    }
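
To make the NV21 vs. I420 byte order concrete, here is a tiny pure-Kotlin sketch (no Android types, so it runs anywhere) that lays out the chroma of a hypothetical 4x2 frame both ways; the letter bytes merely stand in for real samples:

```kotlin
// Lay out a 4x2 frame's chroma planar (I420) and semi-planar (NV21).
fun planarVsSemiPlanar(): Pair<String, String> {
    // 4x2 frame: 8 Y samples; 2x2 chroma subsampling gives 2 U and 2 V samples
    val y = ByteArray(8) { 'Y'.code.toByte() }
    val u = byteArrayOf('U'.code.toByte(), 'U'.code.toByte())
    val v = byteArrayOf('V'.code.toByte(), 'V'.code.toByte())
    val i420 = y + u + v                               // planar: all U, then all V
    val nv21 = y + byteArrayOf(v[0], u[0], v[1], u[1]) // semi-planar: V/U interleaved
    return String(i420) to String(nv21)
}

fun main() {
    val (i420, nv21) = planarVsSemiPlanar()
    println(i420) // YYYYYYYYUUVV
    println(nv21) // YYYYYYYYVUVU
}
```

This is exactly the difference `yuv420ToNv21` above has to account for: instead of copying the U and V planes back to back, it emits one V byte and one U byte per 2x2 block.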

    internal fun release() {
        // Stop the repeating request, then tear down the session, reader, and device

        try {
            mCaptureSession?.stopRepeating()
            mCaptureSession?.abortCaptures()

            mCaptureSession?.close()
            mImageReader?.close()
            mCameraDevice?.close()

            mImageReader?.setOnImageAvailableListener(null, null)

            outputStream?.close()

            mCaptureSession = null
            mImageReader = null
            mCameraDevice = null

            cameraHandler.removeCallbacksAndMessages(null)
            val quitSafely: Boolean = cameraThread.quitSafely()
            Log.i(TAG, "release -> quitSafely: $quitSafely")
            cameraThread.join()

        } catch (e: Exception) {
            Log.e(TAG, "release -> error: ${e.message}", e)
        }
    }

    fun yuv420ToYuv420p(image: Image): ByteArray {

        val width = image.width
        val height = image.height

        val ySize = width * height

        val uvSize = width * height / 4  // each U/V plane is (width/2) * (height/2)

        val out = ByteArray(ySize + uvSize * 2)

        val yPlane = image.planes[0]
        val uPlane = image.planes[1]
        val vPlane = image.planes[2]

        var pos = 0

        // Copy the Y plane
        val yBuffer = yPlane.buffer
        for (row in 0 until height) {
            yBuffer.position(row * yPlane.rowStride)
            yBuffer.get(out, pos, width)
            pos += width
        }

        // Copy the U plane
        val uBuffer = uPlane.buffer
        for (row in 0 until height / 2) {
            for (col in 0 until width / 2) {
                val index = row * uPlane.rowStride + col * uPlane.pixelStride
                out[pos++] = uBuffer.get(index)
            }
        }

        // Copy the V plane
        val vBuffer = vPlane.buffer
        for (row in 0 until height / 2) {
            for (col in 0 until width / 2) {
                val index = row * vPlane.rowStride + col * vPlane.pixelStride
                out[pos++] = vBuffer.get(index)
            }
        }

        return out
    }
}
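
The ffplay command in the class KDoc assumes each raw I420 frame occupies exactly width * height * 3/2 bytes, which is the size `yuv420ToYuv420p` writes per frame. A quick pure-Kotlin sanity check of that arithmetic (no Android dependencies):

```kotlin
// Size in bytes of one planar YUV420p (I420) frame.
fun yuv420pFrameSize(width: Int, height: Int): Int {
    val ySize = width * height               // one Y byte per pixel
    val uvSize = (width / 2) * (height / 2)  // U and V are each subsampled 2x2
    return ySize + 2 * uvSize
}

fun main() {
    // 1280x720: 921600 Y + 2 * 230400 chroma = 1382400 bytes per frame
    println(yuv420pFrameSize(1280, 720))
}
```

So a 30 fps 1280x720 raw capture grows by roughly 1382400 * 30 ≈ 41 MB per second, which is why the output file fills up quickly.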

The IPC entry point: connectDevice
frameworks/av/services/camera/libcameraservice/CameraService.cpp

Status CameraService::connectDevice(
        const sp<hardware::camera2::ICameraDeviceCallbacks>& cameraCb,
        const std::string& unresolvedCameraId,
        int oomScoreOffset, int targetSdkVersion,
        int rotationOverride, const AttributionSourceState& clientAttribution, int32_t devicePolicy,
        bool sharedMode,
        /*out*/sp<hardware::camera2::ICameraDeviceUser>* device) {
    return connectDeviceImpl(cameraCb, unresolvedCameraId, oomScoreOffset, targetSdkVersion,
            rotationOverride, clientAttribution, devicePolicy, sharedMode,
            /*isVendorClient*/false, device);
}
posted @ 2025-10-25 19:05  爱情丶眨眼而去