Overview
Getting to know a class is like making a friend; reading a piece of source code is like having a first-rate conversation;
reading a framework is like witnessing a school of thought; building a program is like creating a life;
a Git commit is a record of growth. Life may not always be beautiful, but it can be this noble. ----张风捷特烈
I. About SurfaceView
SurfaceView shows up in every scenario that demands high-performance rendering: video, the camera, games, Flutter, and more. If you want high-performance rendering, SurfaceView is a hurdle you have to clear, and it is also a key that opens the door to video. This article walks through minimal examples of the following points to give you an intuitive feel for SurfaceView:
[1]. Camera preview with SurfaceView
[2]. Camera2 preview with SurfaceView
[3]. GLSurfaceView in OpenGL
[4]. Combining Camera2 and OpenGL
[5]. Combining video playback and OpenGL
[6]. How Flutter relates to SurfaceView
1. Starting a Camera preview with SurfaceView
SurfaceView and SurfaceHolder go hand in hand: Camera's setPreviewDisplay method takes a SurfaceHolder as its argument.
The SurfaceHolder is not available immediately, so you register a callback to be notified when it is created, changed, or destroyed, and react accordingly.
The callback interface is SurfaceHolder.Callback; for convenience the view can implement it directly, though a separate class works just as well.
For the detailed walkthrough, see: Android多媒体之Camera的相关操作
public class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Camera camera;

    public CameraSurfaceView(Context context) {
        this(context, null);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        getHolder().addCallback(this);// register the callback on the SurfaceView's SurfaceHolder
    }

    //----------------- SurfaceHolder.Callback overrides ----------------------
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        camera.setDisplayOrientation(90);
        try {
            camera.setPreviewDisplay(holder);// Camera + SurfaceHolder
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.release();// release the camera
    }
}
2. Using SurfaceView with Camera2
Camera2 does not mean a Camera2 class; it refers to the camera system under the camera2 package. It is admittedly complex to use, but simplicity has its limits and complexity has its value. At its display core it likewise needs a SurfaceHolder.
For the detailed walkthrough, see: Android多媒体之Camera2的相关操作
public class Camera2SurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Handler mainHandler;
    private String mCameraID;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;// camera device
    private CameraCaptureSession mCameraCaptureSession;
    private Handler childHandler;
    private CameraDevice.StateCallback mStateCallback;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);// keeps the app from exiting before the camera closes

    public Camera2SurfaceView(Context context) {
        this(context, null);
    }

    public Camera2SurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        getHolder().addCallback(this);// register the callback on the SurfaceView's SurfaceHolder
    }

    //----------------- SurfaceHolder.Callback overrides ----------------------
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        initHandler();// set up the thread handlers
        initCamera();// set up the camera
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA) !=
                    PackageManager.PERMISSION_GRANTED) {
                return;
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mCameraDevice.close();// release the camera device
    }

    private void initCamera() {
        // ID "0" (the int value of LENS_FACING_FRONT; on most devices ID "0" is the back camera)
        mCameraID = "" + CameraCharacteristics.LENS_FACING_FRONT;
        // get the camera manager
        mCameraManager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);
        mStateCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera;
                startPreview();
            }

            @Override
            public void onDisconnected(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }

            @Override
            public void onError(@NonNull CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
        };
    }

    private void initHandler() {
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mainHandler = new Handler(getMainLooper());// main-thread Handler
        childHandler = new Handler(handlerThread.getLooper());// worker-thread Handler
    }

    /**
     * Start the preview
     */
    private void startPreview() {
        try {
            // Build the CaptureRequest needed for the preview
            final CaptureRequest.Builder reqBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // Use the SurfaceView's surface as the request's target
            reqBuilder.addTarget(getHolder().getSurface());
            // Create the CameraCaptureSession, which manages preview and capture requests
            CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                    if (null == mCameraDevice) return;
                    mCameraCaptureSession = cameraCaptureSession;// the camera is ready: start showing the preview
                    try {// show the preview
                        mCameraCaptureSession.setRepeatingRequest(reqBuilder.build(), null, childHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                }
            };
            mCameraDevice.createCaptureSession(Collections.singletonList(getHolder().getSurface()),
                    stateCallback, childHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}
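A note on mCameraOpenCloseLock: the snippet above only ever releases the semaphore; the acquire half (blocking with a timeout before openCamera, as in Google's Camera2Basic sample) is what actually prevents open/close races. A plain-JVM sketch of that pattern, with class and method names that are my own, not Android APIs:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class CameraLockSketch {
    private final Semaphore lock = new Semaphore(1);// one permit: one open/close in flight
    private boolean opened = false;

    // Mirrors the open path: wait (with timeout) until no close is in progress.
    public boolean open() {
        try {
            if (!lock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
        opened = true;
        lock.release();// in Camera2 this release happens inside StateCallback.onOpened()
        return true;
    }

    // Mirrors the close path: hold the permit so a concurrent open() must wait.
    public void close() {
        lock.acquireUninterruptibly();
        opened = false;
        lock.release();
    }

    public boolean isOpened() {
        return opened;
    }

    public static void main(String[] args) {
        CameraLockSketch cam = new CameraLockSketch();
        cam.open();
        cam.close();
        System.out.println(cam.isOpened()); // false after an orderly close
    }
}
```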
3. Using GLSurfaceView in OpenGL
GLSurfaceView, a subclass of SurfaceView, opens a door named OpenGL.
It implements the SurfaceHolder.Callback2 interface and needs to be given a GLSurfaceView.Renderer.
public class TriangleGLView extends GLSurfaceView implements GLSurfaceView.Renderer {
    private Triangle mTriangle;

    public TriangleGLView(Context context) {
        this(context, null);
    }

    public TriangleGLView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(2);// request an OpenGL ES 2.0 context
        setRenderer(this);// set the renderer
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mTriangle = new Triangle();
        GLES20.glClearColor(1.0f, 0.0f, 0.0f, 1.0f);// rgba
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);// GL viewport
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // clear the color and depth buffers
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        mTriangle.draw();
    }
}
OpenGL drawing has a reputation for scaring people off; below is the simplest case, drawing a triangle.
If you are interested, the author's OpenGL series rewards a careful read and should be enough to get you started:
Android多媒体之GL-ES战记第一集--勇者集结
Android多媒体之GL-ES战记第二集--谜团立方
Android多媒体之GLES2战记第三集--圣火之光
Android多媒体之GLES2战记第四集--移形换影
Android多媒体之GLES2战记第五集--宇宙之光
Android多媒体之GLES2战记第六集--九层之台
public class Triangle {
    private FloatBuffer vertexBuffer;// vertex buffer
    private final String vertexShaderCode =// vertex shader source
            "attribute vec4 vPosition;" +
            "void main() {" +
            "  gl_Position = vPosition;" +
            "}";
    private final String fragmentShaderCode =// fragment shader source
            "precision mediump float;" +
            "uniform vec4 vColor;" +
            "void main() {" +
            "  gl_FragColor = vColor;" +
            "}";
    private final int mProgram;
    private int mPositionHandle;// position handle
    private int mColorHandle;// color handle
    private final int vertexCount = sCoo.length / COORDS_PER_VERTEX;// number of vertices
    private final int vertexStride = COORDS_PER_VERTEX * 4;// 3 * 4 = 12 bytes per vertex
    // number of coordinates per vertex in the array
    static final int COORDS_PER_VERTEX = 3;
    static float sCoo[] = {// in counterclockwise order
            0.0f, 0.0f, 0.0f,   // top
            -1.0f, -1.0f, 0.0f, // bottom left
            1.0f, -1.0f, 0.0f   // bottom right
    };
    // color, rgba
    float color[] = {0.63671875f, 0.76953125f, 0.22265625f, 1.0f};

    public Triangle() {
        // initialize the vertex byte buffer
        ByteBuffer bb = ByteBuffer.allocateDirect(sCoo.length * 4);// 4 bytes per float
        bb.order(ByteOrder.nativeOrder());// use the device's native byte order
        vertexBuffer = bb.asFloatBuffer();// view the byte buffer as a float buffer
        vertexBuffer.put(sCoo);// copy the coordinates into the FloatBuffer
        vertexBuffer.position(0);// rewind so reads start at the first coordinate
        int vertexShader = loadShader(
                GLES20.GL_VERTEX_SHADER,// vertex shader
                vertexShaderCode);
        int fragmentShader = loadShader(
                GLES20.GL_FRAGMENT_SHADER,// fragment shader
                fragmentShaderCode);
        mProgram = GLES20.glCreateProgram();// create an empty OpenGL ES program
        GLES20.glAttachShader(mProgram, vertexShader);// attach the vertex shader
        GLES20.glAttachShader(mProgram, fragmentShader);// attach the fragment shader
        GLES20.glLinkProgram(mProgram);// link into an executable program
    }

    private int loadShader(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);// create the shader
        GLES20.glShaderSource(shader, shaderCode);// attach its source
        GLES20.glCompileShader(shader);// compile
        return shader;
    }

    public void draw() {
        // add the program to the OpenGL ES environment
        GLES20.glUseProgram(mProgram);
        // get the handle of the vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        // enable the triangle's vertex attribute
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        // supply the triangle coordinate data
        GLES20.glVertexAttribPointer(
                mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false,
                vertexStride, vertexBuffer);
        // get the handle of the fragment shader's vColor member
        mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        // set the triangle's color
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);
        // draw the triangle
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
        // disable the vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }
}
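The buffer dance in the Triangle constructor (allocateDirect, nativeOrder, asFloatBuffer, put, position(0)) is plain java.nio and can be checked off-device. A sketch of the same packing steps; the class name is my own:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class VertexBufferSketch {
    // Same packing as Triangle: 4 bytes per float, native byte order,
    // rewound to position 0 so GL reads from the first coordinate.
    public static FloatBuffer pack(float[] coords) {
        ByteBuffer bb = ByteBuffer.allocateDirect(coords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(coords);
        fb.position(0);
        return fb;
    }

    public static void main(String[] args) {
        float[] sCoo = {0f, 0f, 0f, -1f, -1f, 0f, 1f, -1f, 0f};
        FloatBuffer fb = pack(sCoo);
        System.out.println(fb.isDirect());  // true: off-heap, usable by native GL
        System.out.println(fb.remaining()); // 9: all floats visible from position 0
    }
}
```

A direct buffer is required here because GL reads the memory from native code; a heap-backed FloatBuffer would not be guaranteed a stable address.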
4. Using OpenGL with the camera
Let's take stock: the camera needs a Surface to render into, and GLSurfaceView is a SurfaceView, so they look made for each other. But good things rarely come easy; it is not quite as simple as you might imagine...
In the main CameraGLView class we create a SurfaceTexture and attach a texture object to it. A Surface can then be constructed from that SurfaceTexture, and the path is open.
public class CameraGLView extends GLSurfaceView implements GLSurfaceView.Renderer {
    private CameraDrawer cameraDrawer;
    public SurfaceTexture surfaceTexture;
    private int[] textureId = new int[1];
    //---------------------------- camera members ------------------------------
    private Handler mainHandler;
    private Handler childHandler;
    private String mCameraID;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;// camera device
    private CameraCaptureSession mCameraCaptureSession;
    private CameraDevice.StateCallback mStateCallback;
    private Size mVideoSize;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);// keeps the app from exiting before the camera closes
    private Surface surface;

    public CameraGLView(Context context) {
        this(context, null);
    }

    public CameraGLView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(3);// request an OpenGL ES 3.0 context
        setRenderer(this);// set the renderer
    }

    private void initHandler() {
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mainHandler = new Handler(getMainLooper());// main-thread Handler
        childHandler = new Handler(handlerThread.getLooper());// worker-thread Handler
    }

    private void initCamera() {
        // ID "0" (the int value of LENS_FACING_FRONT; on most devices ID "0" is the back camera)
        mCameraID = "" + CameraCharacteristics.LENS_FACING_FRONT;
        // get the camera manager
        mCameraManager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);
        mVideoSize = getCameraOutputSizes(mCameraManager, mCameraID, SurfaceTexture.class).get(0);
        mStateCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera;
                startPreview();
            }

            @Override
            public void onDisconnected(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }

            @Override
            public void onError(@NonNull CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
        };
    }

    /**
     * Returns the camera's output sizes for the given output class, sorted in descending order.
     */
    public List<Size> getCameraOutputSizes(CameraManager cameraManager, String cameraId, Class clz) {
        try {
            CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            List<Size> sizes = Arrays.asList(configs.getOutputSizes(clz));
            Collections.sort(sizes, (o1, o2) -> o1.getWidth() * o1.getHeight() - o2.getWidth() * o2.getHeight());
            Collections.reverse(sizes);
            return sizes;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }

    /**
     * Start the preview
     */
    private void startPreview() {
        surfaceTexture.setDefaultBufferSize(mVideoSize.getWidth(), mVideoSize.getHeight());
        surfaceTexture.setOnFrameAvailableListener(surfaceTexture -> requestRender());
        surface = new Surface(surfaceTexture);
        try {
            // Build the CaptureRequest needed for the preview
            final CaptureRequest.Builder reqBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // Use the SurfaceTexture-backed surface as the request's target
            reqBuilder.addTarget(surface);
            // Create the CameraCaptureSession, which manages preview and capture requests
            CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                    if (null == mCameraDevice) return;
                    mCameraCaptureSession = cameraCaptureSession;// the camera is ready: start showing the preview
                    try {// show the preview
                        mCameraCaptureSession.setRepeatingRequest(reqBuilder.build(), null, childHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                }
            };
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), stateCallback, childHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        cameraDrawer = new CameraDrawer(getContext());
        // create the texture object
        GLES30.glGenTextures(textureId.length, textureId, 0);
        // attach the texture object to the SurfaceTexture
        surfaceTexture = new SurfaceTexture(textureId[0]);
        initHandler();// set up the thread handlers
        initCamera();// set up the camera
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA) !=
                    PackageManager.PERMISSION_GRANTED) {
                return;
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        surfaceTexture.updateTexImage();
        GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT);
        cameraDrawer.draw(textureId[0]);
    }
}
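The descending area sort inside getCameraOutputSizes is easy to verify off-device. A sketch with a stand-in Size class (android.util.Size is Android-only); note that the subtraction comparator can overflow for very large areas, so Integer.compare would be the safer choice in production:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class OutputSizeSortSketch {
    // Stand-in for android.util.Size, which is unavailable off-device.
    static final class Size {
        final int w, h;
        Size(int w, int h) { this.w = w; this.h = h; }
        int area() { return w * h; }
    }

    // Same idea as getCameraOutputSizes: sort ascending by area, then reverse.
    public static List<Size> sortDescending(List<Size> sizes) {
        Collections.sort(sizes, (o1, o2) -> o1.area() - o2.area());
        Collections.reverse(sizes);
        return sizes;
    }

    public static void main(String[] args) {
        List<Size> sizes = new ArrayList<>(Arrays.asList(
                new Size(1280, 720), new Size(1920, 1080), new Size(640, 480)));
        sortDescending(sizes);
        System.out.println(sizes.get(0).w); // 1920: largest resolution comes first
    }
}
```

Taking .get(0) of the sorted list therefore picks the camera's largest available preview size, which is what mVideoSize ends up holding.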
The CameraDrawer class then draws the texture. This is very similar to drawing the triangle: a shader does the coloring.
Fragment shader: camera.fsh
#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;
in vec2 vTexCoord;
out vec4 outColor;
uniform samplerExternalOES sTexture;
void main(){
    outColor = texture(sTexture, vTexCoord);
}
Vertex shader: camera.vsh
#version 300 es
in vec4 aPosition;
in vec2 aTexCoord;
out vec2 vTexCoord;
void main(){
    gl_Position = aPosition;
    vTexCoord = aTexCoord;
}
The drawer: CameraDrawer
public class CameraDrawer {
    // note: these names must match the shader sources above (aPosition/aTexCoord/sTexture),
    // otherwise glGetAttribLocation/glGetUniformLocation return -1
    private static final String VERTEX_ATTRIB_POSITION = "aPosition";
    private static final int VERTEX_ATTRIB_POSITION_SIZE = 3;
    private static final String VERTEX_ATTRIB_TEXTURE_POSITION = "aTexCoord";
    private static final int VERTEX_ATTRIB_TEXTURE_POSITION_SIZE = 2;
    private static final String UNIFORM_TEXTURE = "sTexture";
    private float[] vertex = {
            -1f, 1f, 0.0f,  // top left
            -1f, -1f, 0.0f, // bottom left
            1f, -1f, 0.0f,  // bottom right
            1f, 1f, 0.0f    // top right
    };
    // texture coordinates (s, t); the t axis points opposite to the vertex y axis
    public float[] textureCoord = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
            0.0f, 0.0f
    };
    private FloatBuffer vertexBuffer;
    private FloatBuffer textureCoordBuffer;
    private int program;
    private Context context;

    public CameraDrawer(Context context) {
        this.context = context;
        initVertexAttrib();// initialize the vertex data
        program = GLUtil.loadAndInitProgram(this.context, "camera.vsh", "camera.fsh");
        GLES30.glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
    }

    private void initVertexAttrib() {
        textureCoordBuffer = GLUtil.getFloatBuffer(textureCoord);
        vertexBuffer = GLUtil.getFloatBuffer(vertex);
    }

    public void draw(int textureId) {
        GLES30.glUseProgram(program);
        // look up the attribute handles
        int vertexLoc = GLES30.glGetAttribLocation(program, VERTEX_ATTRIB_POSITION);
        int textureLoc = GLES30.glGetAttribLocation(program, VERTEX_ATTRIB_TEXTURE_POSITION);
        GLES30.glEnableVertexAttribArray(vertexLoc);
        GLES30.glEnableVertexAttribArray(textureLoc);
        GLES30.glVertexAttribPointer(vertexLoc,
                VERTEX_ATTRIB_POSITION_SIZE,
                GLES30.GL_FLOAT,
                false,
                0,
                vertexBuffer);
        GLES30.glVertexAttribPointer(textureLoc,
                VERTEX_ATTRIB_TEXTURE_POSITION_SIZE,
                GLES30.GL_FLOAT,
                false,
                0,
                textureCoordBuffer);
        // bind the external OES texture
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
        int uTextureLoc = GLES30.glGetUniformLocation(program, UNIFORM_TEXTURE);
        GLES30.glUniform1i(uTextureLoc, 0);
        // draw
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_FAN, 0, vertex.length / 3);
        // disable the vertex attributes
        GLES30.glDisableVertexAttribArray(vertexLoc);
        GLES30.glDisableVertexAttribArray(textureLoc);
    }
}
If you don't know OpenGL, you may look at all this and cry out:
all that trouble just to get a preview running? Forget it, I'm out.
Yes, it is tedious, and it gets more tedious later. But what you won't do, someone else will; where you fear the trouble, someone else digs in, and that is where the gap between people comes from. What I understand least are people who fear trouble yet go around asking for study methods. As long as you are willing to take the trouble, to dig into problems, read the source, and debug, nothing can stop you. Is anything in this world truly hard or easy? Work at a hard thing and it becomes easy; neglect an easy thing and it becomes hard.
OpenGL opens a great door: with shaders you can do a remarkable amount, including filters, texturing, coloring, transforms... You might even say:
give me a place to use a shader, and I can create a world for you.
5. Using OpenGL for video playback
If you know a little about video playback, you will know that MediaPlayer and Surface work hand in glove.
By the same reasoning, video playback can be combined with OpenGL, and a shader can then rewrite its fate.
The idea is almost identical: in GLVideoView, bind a texture to a SurfaceTexture and create a Surface from it for the MediaPlayer.
For MediaPlayer-based video playback, see: Android多媒体之视频播放器(基于MediaPlayer)
public class GLVideoView extends GLSurfaceView implements GLSurfaceView.Renderer,
        SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener {
    private float[] sTMatrix = new float[16];
    private final float[] projectionMatrix = new float[16];
    private SurfaceTexture surfaceTexture;
    private MediaPlayer mediaPlayer;
    private VideoDrawer videoDrawer;
    private int textureId;
    private boolean updateSurface;
    private boolean playerPrepared;
    private int screenWidth, screenHeight;

    public GLVideoView(Context context) {
        super(context);
    }

    public GLVideoView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(2);// request an OpenGL ES 2.0 context
        setRenderer(this);// set the renderer
        initPlayer();
    }

    private void initPlayer() {
        mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(getContext(), Uri.parse("/sdcard/toly/sh.mp4"));
        } catch (IOException e) {
            e.printStackTrace();
        }
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer.setLooping(true);
        mediaPlayer.setOnVideoSizeChangedListener(this);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        videoDrawer = new VideoDrawer(getContext());
        playerPrepared = false;
        synchronized (this) {
            updateSurface = false;
        }
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        textureId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);
        Surface surface = new Surface(surfaceTexture);
        mediaPlayer.setSurface(surface);
        surface.release();
        if (!playerPrepared) {
            try {
                mediaPlayer.prepare();
                playerPrepared = true;
            } catch (IOException e) {
                e.printStackTrace();
            }
            mediaPlayer.start();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        screenWidth = width;
        screenHeight = height;
        GLES20.glViewport(0, 0, screenWidth, screenHeight);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (this) {
            if (updateSurface) {
                surfaceTexture.updateTexImage();
                surfaceTexture.getTransformMatrix(sTMatrix);
                updateSurface = false;
            }
        }
        videoDrawer.draw(textureId, projectionMatrix, sTMatrix);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        updateSurface = true;
    }

    @Override
    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        // Don't overwrite screenWidth/screenHeight with the video size here;
        // doing so would make screenRatio == videoRatio and defeat the aspect correction.
        updateProjection(width, height);
    }

    private void updateProjection(int videoWidth, int videoHeight) {
        float screenRatio = (float) screenWidth / screenHeight;
        float videoRatio = (float) videoWidth / videoHeight;
        if (videoRatio > screenRatio) {
            Matrix.orthoM(projectionMatrix, 0,
                    -1f, 1f, -videoRatio / screenRatio, videoRatio / screenRatio,
                    -1f, 1f);
        } else {
            Matrix.orthoM(projectionMatrix, 0,
                    -screenRatio / videoRatio, screenRatio / videoRatio, -1f, 1f,
                    -1f, 1f);
        }
    }
}
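updateProjection letterboxes by widening the orthographic volume along whichever axis the video doesn't fill; the branch arithmetic can be verified without GL. A sketch (method name is my own) that returns the {left, right, bottom, top} values handed to Matrix.orthoM:

```java
public class ProjectionSketch {
    // Mirrors updateProjection: returns {left, right, bottom, top} for Matrix.orthoM.
    public static float[] orthoBounds(int screenW, int screenH, int videoW, int videoH) {
        float screenRatio = (float) screenW / screenH;
        float videoRatio = (float) videoW / videoH;
        if (videoRatio > screenRatio) {
            // Video is wider than the screen: stretch the vertical range (pad top/bottom).
            float v = videoRatio / screenRatio;
            return new float[]{-1f, 1f, -v, v};
        } else {
            // Video is taller than the screen: stretch the horizontal range (pad left/right).
            float s = screenRatio / videoRatio;
            return new float[]{-s, s, -1f, 1f};
        }
    }

    public static void main(String[] args) {
        // 16:9 video on a 9:16 portrait screen: videoRatio (~1.78) > screenRatio (~0.56),
        // so the top bound grows to videoRatio/screenRatio and the video keeps its aspect.
        float[] b = orthoBounds(1080, 1920, 1920, 1080);
        System.out.println(b[3]); // ~3.16
    }
}
```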
The shaders:
---->[video.fsh]----
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    vec3 color = texture2D(sTexture, vTexCoord).rgb;
    float threshold = 0.8;// threshold
    float mean = (color.r + color.g + color.b) / 3.0;
    color.r = color.g = color.b = mean >= threshold ? 1.0 : 0.0;
    gl_FragColor = vec4(1, color);// force the red channel to full
}
---->[video.vsh]----
attribute vec4 aPosition;// vertex position
attribute vec4 aTexCoord;// texture coordinate
varying vec2 vTexCoord;
uniform mat4 uMatrix;
uniform mat4 uSTMatrix;
void main() {
    vTexCoord = (uSTMatrix * aTexCoord).xy;
    gl_Position = uMatrix * aPosition;
}
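The fragment shader's binarize-then-tint logic is pure arithmetic, so it can be sanity-checked on the JVM. A sketch mirroring video.fsh per pixel (class and method names are my own):

```java
public class ThresholdSketch {
    // Mirrors video.fsh: average the channels, binarize against 0.8,
    // then prepend a full red channel as in gl_FragColor = vec4(1, color).
    public static float[] shade(float r, float g, float b) {
        float threshold = 0.8f;
        float mean = (r + g + b) / 3.0f;
        float v = mean >= threshold ? 1.0f : 0.0f;
        return new float[]{1.0f, v, v, v}; // rgba laid out like vec4(1.0, color)
    }

    public static void main(String[] args) {
        float[] bright = shade(0.9f, 0.9f, 0.9f);
        float[] dark = shade(0.2f, 0.3f, 0.1f);
        System.out.println(bright[1]); // 1.0: bright pixels become white (red already full)
        System.out.println(dark[1]);   // 0.0: dark pixels collapse to pure red
    }
}
```

So the effect on screen is a red-tinted two-tone image: anything above the brightness threshold renders white, everything else renders red.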
VideoDrawer then performs the shading and drawing:
public class VideoDrawer {
    private Context context;
    private int aPositionLocation;
    private int programId;
    private FloatBuffer vertexBuffer;
    private final float[] vertexData = {
            1f, -1f, 0f,
            -1f, -1f, 0f,
            1f, 1f, 0f,
            -1f, 1f, 0f
    };
    private int uMatrixLocation;
    private final float[] textureVertexData = {
            1f, 0f,
            0f, 0f,
            1f, 1f,
            0f, 1f
    };
    private FloatBuffer textureVertexBuffer;
    private int uTextureSamplerLocation;
    private int aTextureCoordLocation;
    private int uSTMMatrixHandle;

    public VideoDrawer(Context context) {
        this.context = context;
        vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexData);
        vertexBuffer.position(0);
        textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureVertexData);
        textureVertexBuffer.position(0);
        programId = GLUtil.loadAndInitProgram(context, "video.vsh", "video.fsh");
        aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");
        uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
        uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
        uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
        aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");
    }

    public void draw(int textureId, float[] projectionMatrix, float[] sTMatrix) {
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glUseProgram(programId);
        GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
        GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, sTMatrix, 0);
        vertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aPositionLocation);
        GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
                12, vertexBuffer);
        textureVertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
        GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false, 8, textureVertexBuffer);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES20.glUniform1i(uTextureSamplerLocation, 0);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}
6. Flutter and SurfaceView
If you have looked into how Flutter is implemented, FlutterView should be familiar. On the Android side, all of Flutter's content is drawn inside FlutterView, and FlutterView extends SurfaceView, which says a great deal about how capable SurfaceView is.

public class FlutterView extends SurfaceView
        implements BinaryMessenger, TextureRegistry {

Being a SurfaceView, it naturally involves the familiar machinery from earlier: the callback, the SurfaceHolder, and so on.
Among its fields are mSurfaceCallback and nextTextureId; look familiar?
mSurfaceCallback is created right in the constructor, wiring up surfaceCreated, surfaceChanged, and surfaceDestroyed.
public class FlutterView extends SurfaceView implements BinaryMessenger, TextureRegistry {
    private static final String TAG = "FlutterView";
    //...
    private final Callback mSurfaceCallback;
    //...
    private final AtomicLong nextTextureId;

    // in the constructor:
    this.mSurfaceCallback = new Callback() {
        public void surfaceCreated(SurfaceHolder holder) {
            FlutterView.this.assertAttached();
            FlutterView.this.mNativeView.getFlutterJNI().onSurfaceCreated(holder.getSurface());
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            FlutterView.this.assertAttached();
            FlutterView.this.mNativeView.getFlutterJNI().onSurfaceChanged(width, height);
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
            FlutterView.this.assertAttached();
            FlutterView.this.mNativeView.getFlutterJNI().onSurfaceDestroyed();
        }
    };
    this.getHolder().addCallback(this.mSurfaceCallback);
In addition, mSurfaceCallback is removed on both detach and destroy:
public FlutterNativeView detach() {
    if (!this.isAttached()) {
        return null;
    } else {
        this.getHolder().removeCallback(this.mSurfaceCallback);
        this.mNativeView.detachFromFlutterView();
        FlutterNativeView view = this.mNativeView;
        this.mNativeView = null;
        return view;
    }
}

public void destroy() {
    if (this.isAttached()) {
        this.getHolder().removeCallback(this.mSurfaceCallback);
        this.mNativeView.destroy();
        this.mNativeView = null;
    }
}
- Creating the SurfaceTexture instances
They are built through the inner class SurfaceTextureRegistryEntry, and setOnFrameAvailableListener appears here as well, just as it did above.
Getting on nodding terms with these APIs pays off: at least when you meet them again, you know what they are up to.
public SurfaceTextureEntry createSurfaceTexture() {
    SurfaceTexture surfaceTexture = new SurfaceTexture(0);
    surfaceTexture.detachFromGLContext();
    FlutterView.SurfaceTextureRegistryEntry entry =
            new FlutterView.SurfaceTextureRegistryEntry(this.nextTextureId.getAndIncrement(), surfaceTexture);
    this.mNativeView.getFlutterJNI().registerTexture(entry.id(), surfaceTexture);
    return entry;
}

final class SurfaceTextureRegistryEntry implements SurfaceTextureEntry {
    private final long id;
    private final SurfaceTexture surfaceTexture;
    private boolean released;
    private OnFrameAvailableListener onFrameListener = new OnFrameAvailableListener() {
        public void onFrameAvailable(SurfaceTexture texture) {
            if (!SurfaceTextureRegistryEntry.this.released && FlutterView.this.mNativeView != null) {
                FlutterView.this.mNativeView.getFlutterJNI()
                        .markTextureFrameAvailable(SurfaceTextureRegistryEntry.this.id);
            }
        }
    };

    SurfaceTextureRegistryEntry(long id, SurfaceTexture surfaceTexture) {
        this.id = id;
        this.surfaceTexture = surfaceTexture;
        if (VERSION.SDK_INT >= 21) {
            this.surfaceTexture.setOnFrameAvailableListener(this.onFrameListener, new Handler());
        } else {
            this.surfaceTexture.setOnFrameAvailableListener(this.onFrameListener);
        }
    }

    public SurfaceTexture surfaceTexture() {
        return this.surfaceTexture;
    }

    public long id() {
        return this.id;
    }
    //...
}
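Note how createSurfaceTexture() hands every entry a unique id from nextTextureId.getAndIncrement(); it is this id, not the SurfaceTexture object itself, that later crosses the JNI boundary in markTextureFrameAvailable. The id scheme in isolation (class and method names are my own):

```java
import java.util.concurrent.atomic.AtomicLong;

public class TextureIdSketch {
    // Same role as FlutterView.nextTextureId: a thread-safe monotonically increasing counter.
    private final AtomicLong nextTextureId = new AtomicLong(0L);

    // Each registration gets the current value and bumps the counter atomically.
    public long register() {
        return nextTextureId.getAndIncrement();
    }

    public static void main(String[] args) {
        TextureIdSketch registry = new TextureIdSketch();
        System.out.println(registry.register()); // 0
        System.out.println(registry.register()); // 1
    }
}
```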
As we saw earlier, what matters most for a SurfaceTexture is the texture binding; here it is implemented behind FlutterJNI's native method nativeRegisterTexture.
---->[io.flutter.embedding.engine.FlutterJNI#registerTexture]----
@UiThread
public void registerTexture(long textureId, @NonNull SurfaceTexture surfaceTexture) {
    this.ensureRunningOnMainThread();
    this.ensureAttachedToNative();
    this.nativeRegisterTexture(this.nativePlatformViewId, textureId, surfaceTexture);
}

---->[io.flutter.embedding.engine.FlutterJNI#nativeRegisterTexture]----
private native void nativeRegisterTexture(long var1, long var3, @NonNull SurfaceTexture var5);
In the past I would have given up at this point, but now we can dig a little further. The first question is where the C++ implementation of nativeRegisterTexture lives.
To read FlutterJNI's C++ side, you need to download the flutter engine source from GitHub.
Location: engine-master/shell/platform/android/platform_view_android_jni.h
static void RegisterTexture(JNIEnv* env,
                            jobject jcaller,
                            jlong shell_holder,
                            jlong texture_id,
                            jobject surface_texture) {
  ANDROID_SHELL_HOLDER->GetPlatformView()->RegisterExternalTexture(
      static_cast<int64_t>(texture_id),                        //
      fml::jni::JavaObjectWeakGlobalRef(env, surface_texture)  //
  );
}

bool RegisterApi(JNIEnv* env) {
  static const JNINativeMethod flutter_jni_methods[] = {
      {
          .name = "nativeRegisterTexture",
          .signature = "(JJLandroid/graphics/SurfaceTexture;)V",
          .fnPtr = reinterpret_cast<void*>(&RegisterTexture),
      },
      //...
  };
  if (env->RegisterNatives(g_flutter_jni_class->obj(), flutter_jni_methods,
                           fml::size(flutter_jni_methods)) != 0) {
    FML_LOG(ERROR) << "Failed to RegisterNatives with FlutterJNI";
    return false;
  }
  //...
}
And just like that, we have picked up another way to register JNI methods; not a bad deal. What makes a good learning method? Read more, think more, and knowledge will come to meet you.
bool PlatformViewAndroid::Register(JNIEnv* env) {
  if (env == nullptr) {
    FML_LOG(ERROR) << "No JNIEnv provided";
    return false;
  }
  //...
  // FindClass here points at the FlutterJNI Java class we just read.
  g_flutter_jni_class = new fml::jni::ScopedJavaGlobalRef<jclass>(
      env, env->FindClass("io/flutter/embedding/engine/FlutterJNI"));
  if (g_flutter_jni_class->is_null()) {
    FML_LOG(ERROR) << "Failed to find FlutterJNI Class.";
    return false;
  }
  //...
  return RegisterApi(env);
}
We'll stop here rather than keep digging; perhaps a dedicated article some other time. By now you should have an intuitive feel for SurfaceView. From here the torch passes to 何时夕's article: Android绘制机制以及Surface家族源码全解析.
It is the best explanation of the Surface family I have seen so far; it deserves to be read carefully, and more than once.
That's all for this article. The road is long; until we meet again.
I'm 张风捷特烈. If there is anything you'd like to discuss, feel free to leave a comment, or add me on WeChat: zdl1994328