Flutter and Native Communication (3): BasicMessageChannel

Flutter and native code can communicate through the Platform Channels APIs, which come in three main flavors:

  • [MethodChannel]: for passing method invocations
  • [EventChannel]: for sending event streams
  • [BasicMessageChannel]: for passing strings and semi-structured messages

BasicMessageChannel is used to send messages back and forth between Flutter and native code: one side sends a message, and the other side replies once it receives it. As usual, let's first walk through the basic API flow and then look at the code.

1. The basic flow of BasicMessageChannel

Flutter sends a message to native

  1. [flutter] Create a BasicMessageChannel
  2. [native] Register a handler via BasicMessageChannel#setMessageHandler
  3. [flutter] Send a message via BasicMessageChannel#send
  4. [native] Receive the message in BasicMessageChannel#MessageHandler#onMessage, then reply

Native sends a message to flutter

The flow is the same, with the [flutter] and [native] roles swapped.

2. Code implementation

The flutter side

The flutter side needs to:

  • Create a BasicMessageChannel
  • Send messages via BasicMessageChannel#send

Compared with the other channel types, creating a BasicMessageChannel requires specifying a message codec in addition to the channel name:

BasicMessageChannel(String name, MessageCodec<T> codec, {BinaryMessenger binaryMessenger})

Messages are transmitted in binary form, so different kinds of payloads need different binary codecs:

  Codec                  Message format
  BinaryCodec            binary messages
  JSONMessageCodec       JSON-formatted messages
  StandardMessageCodec   basic (standard platform) data types
  StringCodec            String messages
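To make the codec idea concrete: StringCodec, used in the example below, encodes each message as UTF-8 bytes. This is an illustrative sketch in plain Dart using dart:convert; it mimics only the encoding, not the real flutter/services implementation or its channel plumbing:

```dart
import 'dart:convert';
import 'dart:typed_data';

// Illustrative only: mimics what StringCodec does (UTF-8 bytes),
// without the real flutter/services plumbing.
Uint8List encodeMessage(String message) =>
    Uint8List.fromList(utf8.encode(message));

String decodeMessage(Uint8List bytes) => utf8.decode(bytes);

void main() {
  final bytes = encodeMessage('Hello World from Dart');
  print(bytes.length); // byte length of the UTF-8 payload
  print(decodeMessage(bytes)); // round-trips to the original string
}
```

The same shape applies to the other codecs: JSONMessageCodec serializes via JSON before the UTF-8 step, and BinaryCodec passes the bytes through unchanged.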
class _MyHomePageState extends State<MyHomePage> {
  static const _channel = BasicMessageChannel('com.example.messagechannel/interop', StringCodec());
 
  String _platformMessage;
 
  void _sendMessage() async {
    final String reply = await _channel.send('Hello World from Dart');
    print(reply);
  }
 
  @override
  initState() {
    super.initState();
 
    // Receive messages from platform
    _channel.setMessageHandler((String message) async {
      print('Received message = $message');
      setState(() => _platformMessage = message);
      return 'Reply from Dart';
    });
 
    // Send message to platform
    _sendMessage();
  }
}

The native (Android) side

The Android side needs to:

  • Create a BasicMessageChannel
  • Register a MessageHandler via setMessageHandler
  • Receive the message in the MessageHandler#onMessage callback and answer it via reply
class MainActivity: FlutterActivity() {
    override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {
        GeneratedPluginRegistrant.registerWith(flutterEngine)
 
        val channel = BasicMessageChannel(
                flutterEngine.dartExecutor.binaryMessenger,
                "com.example.messagechannel/interop",
                StringCodec.INSTANCE)
 
        // Receive messages from Dart
        channel.setMessageHandler { message, reply ->
            Log.d("Android", "Received message = $message")
            reply.reply("Reply from Android")
        }
 
        // Send message to Dart
        Handler().postDelayed({
            channel.send("Hello World from Android") { reply ->
                Log.d("Android", "$reply")
            }
        }, 500)
    }
}

The message that the Android side replies with is then displayed on the Flutter side.

Posted in Flutter, 07/23/2022 10:06 AM

Flutter and Native Communication (1): MethodChannel

Flutter can communicate with native code, which lets us tap into the capabilities the platform provides. Communication is bidirectional: we can call Flutter-side Dart code from the native layer, and call native code from the Flutter layer. We do this through the Platform Channels APIs, which come in three main flavors:

  • [MethodChannel]: for passing method invocations
  • [EventChannel]: for sending event streams
  • [MessageChannel]: for passing strings and semi-structured messages

The most commonly used is MethodChannel. Using it feels a lot like JNI calls on Android, but MethodChannel is simpler, and unlike JNI's synchronous calls, MethodChannel calls are asynchronous:

1. The basic flow of MethodChannel

As the Flutter architecture diagram shows, Flutter-native communication happens between the Framework and the Engine: internally, the framework exchanges MethodChannel data with the Engine in the form of BinaryMessages. We won't go into BinaryMessage here and will focus on how to use the channels.

Let's first look at the basic flow of using a MethodChannel:

Flutter calls native

  1. [native] Register a callback with MethodChannel#setMethodCallHandler
  2. [flutter] Start an asynchronous call via MethodChannel#invokeMethod
  3. [native] Run the native method and return a Result via Result#success, or return an error on failure
  4. [flutter] Receive the Result returned from native

Native calls flutter

Exactly the same sequence as flutter calling native, with the [native] and [flutter] roles swapped.
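The invoke/handle/reply round trip above can be sketched in plain Dart with a handler function and Futures. Everything here (FakeChannel and its members) is hypothetical scaffolding to show the asynchronous shape, not the real MethodChannel implementation:

```dart
import 'dart:async';

// A toy stand-in for a platform channel: one side registers a handler,
// the other side invokes by method name and awaits the reply.
class FakeChannel {
  Future<dynamic> Function(String method, dynamic args)? _handler;

  void setMethodCallHandler(
          Future<dynamic> Function(String method, dynamic args) handler) =>
      _handler = handler;

  Future<dynamic> invokeMethod(String method, [dynamic args]) async {
    final handler = _handler;
    if (handler == null) throw StateError('no handler registered');
    return handler(method, args); // asynchronous, like a platform channel
  }
}

Future<void> main() async {
  final channel = FakeChannel();
  channel.setMethodCallHandler((method, args) async {
    if (method == 'getList') return ['data0', 'data1', 'data2'];
    throw UnsupportedError(method); // analogous to notImplemented()
  });
  print(await channel.invokeMethod('getList')); // [data0, data1, data2]
}
```

The caller never blocks: it gets a Future immediately and the reply arrives when the handler completes, which is the key difference from a synchronous JNI call.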

2. Code implementation

Flutter calls native

First, implement the following on the flutter side:

  • Create a MethodChannel and register a channel name; "package name/identifier" is the usual convention
  • Start an asynchronous call via invokeMethod, which takes two arguments:
    • method: the name of the native method to call
    • arguments: the native method's parameters; multiple parameters must be passed as a map
import 'package:flutter/services.dart';
 
class _MyHomePageState extends State<MyHomePage> {
  static const MethodChannel _channel = const MethodChannel('com.example.methodchannel/interop');
 
  static Future<dynamic> get _list async {
    final Map params = <String, dynamic> {
      'name': 'my name is hoge',
      'age': 25,
    };
    final List<dynamic> list = await _channel.invokeMethod('getList', params);
    return list;
  }
 
  @override
  initState() {
    super.initState();
 
    // Dart -> Platforms
    _list.then((value) => print(value));
  }
}

Implement the following on the native (Android) side:

  • Create a MethodChannel, using exactly the same registration string as on the flutter side
  • Set a MethodCallHandler; the methodCall carries the arguments passed from flutter
  • Return the result to flutter via result
class MainActivity: FlutterActivity() {
    companion object {
        private const val CHANNEL = "com.example.methodchannel/interop"
        private const val METHOD_GET_LIST = "getList"
    }
 
    private lateinit var channel: MethodChannel
 
    override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {
        GeneratedPluginRegistrant.registerWith(flutterEngine)
 
        channel = MethodChannel(flutterEngine.dartExecutor.binaryMessenger, CHANNEL)
        channel.setMethodCallHandler { methodCall: MethodCall, result: MethodChannel.Result ->
            if (methodCall.method == METHOD_GET_LIST) {
                val name = methodCall.argument<String>("name").toString()
                val age = methodCall.argument<Int>("age")
                Log.d("Android", "name = ${name}, age = $age")
 
                val list = listOf("data0", "data1", "data2")
                result.success(list)
            }
            else
                result.notImplemented()
        }
    }
}

Because the result is returned asynchronously, you can either return it right inside the MethodCallHandler via result.success as in the code above, or hold on to the result reference and call success at some later point. One thing to watch out for: whenever you call result.success, it must run on the UI thread:

@UiThread void success(@Nullable Object result)

Native calls flutter

The code for Android calling flutter mirrors flutter calling Android; just keep in mind that all calls must be made on the UI thread.

First, the Android part:

channel.invokeMethod("callMe", listOf("a", "b"), object : MethodChannel.Result {
    override fun success(result: Any?) {
        Log.d("Android", "result = $result")
    }
    override fun error(errorCode: String?, errorMessage: String?, errorDetails: Any?) {
        Log.d("Android", "$errorCode, $errorMessage, $errorDetails")
    }
    override fun notImplemented() {
        Log.d("Android", "notImplemented")
    }
})

The flutter part mainly registers a MethodCallHandler:

Future<dynamic> _platformCallHandler(MethodCall call) async {
    switch (call.method) {
      case 'callMe':
        print('call callMe : arguments = ${call.arguments}');
        return Future.value('called from platform!');
        //return Future.error('error message!!');
      default:
        print('Unknown method ${call.method}');
        throw MissingPluginException();
    }
  }
 
  @override
  initState() {
    super.initState();
 
    // Platforms -> Dart
    _channel.setMethodCallHandler(_platformCallHandler);
  }
Posted in Flutter, 07/23/2022 10:03 AM

Flutter and Native Communication (2): EventChannel

Flutter and native code can communicate through the Platform Channels APIs, which come in three main flavors:

  • [MethodChannel]: for passing method invocations
  • [EventChannel]: for sending event streams
  • [MessageChannel]: for passing strings and semi-structured messages

EventChannel is used to send notification events from native to flutter; for example, flutter can use it to listen for Android accelerometer changes. Unlike MethodChannel, EventChannel is a one-way call from native to flutter, and the call is multicast (one-to-many), comparable to an Android Broadcast.
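The multicast behavior can be seen with a plain Dart broadcast stream, which is also what receiveBroadcastStream exposes on the Dart side; the listener labels below are illustrative:

```dart
import 'dart:async';

// Every active listener receives every event, like an Android Broadcast.
// sync: true makes delivery immediate so the result is deterministic.
List<String> fanOut(List<String> events) {
  final controller = StreamController<String>.broadcast(sync: true);
  final received = <String>[];
  controller.stream.listen((e) => received.add('listener1: $e'));
  controller.stream.listen((e) => received.add('listener2: $e'));
  events.forEach(controller.add);
  controller.close(); // like endOfStream on the native side
  return received;
}

void main() {
  print(fanOut(['sensor event']));
  // [listener1: sensor event, listener2: sensor event]
}
```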

1. The basic flow of EventChannel

As usual, let's look at the basic API flow first:

  1. [native] Register a handler implementation via EventChannel#setStreamHandler
  2. [native] Once the EventChannel is initialized, grab and keep the EventSink reference in the StreamHandler#onListen callback
  3. [flutter] Register a listener via EventChannel#receiveBroadcastStream to establish the subscription
  4. [native] Send notification events with EventSink#success
  5. [flutter] Receive the event notifications
  6. [native] Call endOfStream when the notifications are finished

2. Code implementation

The flutter side

  • Create an EventChannel, registering a "package name/identifier" channel name
  • Register a listener via receiveBroadcastStream().listen, which returns a StreamSubscription; the cancelOnError parameter controls whether the subscription ends automatically on error
class _MyHomePageState extends State<MyHomePage> {
  static const EventChannel _channel = const EventChannel('com.example.eventchannel/interop');
 
  StreamSubscription _streamSubscription;
  String _platformMessage;
 
  void _enableEventReceiver() {
    _streamSubscription = _channel.receiveBroadcastStream().listen(
        (dynamic event) {
          print('Received event: $event');
          setState(() {
            _platformMessage = event;
          });
        },
        onError: (dynamic error) {
          print('Received error: ${error.message}');
        },
        cancelOnError: true);
  }
 
  void _disableEventReceiver() {
    if (_streamSubscription != null) {
      _streamSubscription.cancel();
      _streamSubscription = null;
    }
  }
 
  @override
  initState() {
    super.initState();
    _enableEventReceiver();
  }
 
  @override
  void dispose() {
    super.dispose();
    _disableEventReceiver();
  }
}

When StreamSubscription#cancel is called, the subscription is cancelled.
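The cancel semantics work the same way on a plain Dart stream, which is a quick way to see them outside Flutter (the function name below is just for the demo):

```dart
import 'dart:async';

// After StreamSubscription.cancel, no further events reach the listener.
List<int> deliveredBeforeCancel() {
  final controller = StreamController<int>.broadcast(sync: true);
  final seen = <int>[];
  final sub = controller.stream.listen(seen.add);
  controller.add(1); // delivered
  sub.cancel();      // subscription ends here
  controller.add(2); // discarded: no listener remains
  controller.close();
  return seen;
}

void main() {
  print(deliveredBeforeCancel()); // [1]
}
```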

The native (Android) side

The Android side needs to:

  • Register a handler implementation via EventChannel#setStreamHandler
  • After initialization, grab and keep the eventSink reference
  • Send event notifications via eventSink
  • Call eventSink#endOfStream when the notifications are finished; onCancel will then be invoked
  • When needed, send error notifications via eventSink#error; flutter receives them in the subscription's onError callback
class MainActivity: FlutterActivity() {
    private lateinit var channel: EventChannel
    var eventSink: EventSink? = null
 
    override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {
        GeneratedPluginRegistrant.registerWith(flutterEngine)
 
        channel = EventChannel(flutterEngine.dartExecutor.binaryMessenger, "com.example.eventchannel/interop")
        channel.setStreamHandler(
                object : StreamHandler {
                    override fun onListen(arguments: Any?, events: EventSink) {
                        eventSink = events
                        Log.d("Android", "EventChannel onListen called")
                        Handler().postDelayed({
                            eventSink?.success("Android")
                            //eventSink?.endOfStream()
                            //eventSink?.error("error code", "error message","error details")
                        }, 500)
                    }
                    override fun onCancel(arguments: Any?) {
                        Log.w("Android", "EventChannel onCancel called")
                    }
                })
    }
}
Posted in Flutter, 07/23/2022 10:01 AM

File Operations in Flutter

Getting app storage paths with path_provider

path_provider is a Flutter plugin for obtaining an app's storage paths. It wraps a unified API for getting the storage paths of both Android and iOS apps:

  • getTemporaryDirectory(): the app's temporary directory, which holds the app's cache and can be cleared at any time; corresponds to Android's getCacheDir() and iOS's NSTemporaryDirectory();
  • getApplicationDocumentsDirectory(): the app's documents directory, deleted when the app is uninstalled; corresponds to Android's app data directory and iOS's NSDocumentDirectory;
  • getExternalStorageDirectory(): the external storage (SD card) directory, Android only;

When creating files and directories with File and Directory, we must first obtain one of these app paths, or the operation will fail.

Working with files/directories via File/Directory

The File and Directory classes live in dart:io, so import that library first:

import 'dart:io';

Creating a file/directory

// Create a directory

Directory tempDir = await getTemporaryDirectory();
  
Directory directory = new Directory('${tempDir.path}/test');

if (!directory.existsSync()) {
    directory.createSync();
    print('Initialized; files will be saved under ${directory.path}');
}

// Create a file

Directory tempDir = await getTemporaryDirectory();
  
File file = new File('${tempDir.path}/test.txt');

if (!file.existsSync()) {
    file.createSync();
    print('test.txt created');
}

Listing a directory's contents

Directory provides the listSync() method to get the directory's contents; it returns a list;

// Print the paths of the files under the test directory

Directory tempDir = await getTemporaryDirectory();
  
Directory directory = new Directory('${tempDir.path}/test');

directory.listSync().forEach((file) {
    print(file.path);
});

Deleting a file/directory

Both files and directories are removed with delete (asynchronous) or deleteSync (synchronous). Deleting a non-empty directory throws an error, so empty it first (or pass recursive: true to remove the directory and its contents in one call):

Directory directory = new Directory(path);

if (directory.existsSync()) {
    List<FileSystemEntity> files = directory.listSync();

    if (files.length > 0) {
      files.forEach((file) {
        file.deleteSync();
      });
    }
    
    directory.deleteSync();
}
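The same cleanup can be done in one call. The sketch below is runnable with plain dart:io; it uses a throwaway temp directory instead of a path_provider path:

```dart
import 'dart:io';

void main() {
  // Build a small non-empty directory tree under the system temp dir.
  final dir = Directory.systemTemp.createTempSync('delete_demo');
  File('${dir.path}/a.txt').writeAsStringSync('hello');
  Directory('${dir.path}/nested').createSync();

  // recursive: true deletes the contents and the directory itself.
  dir.deleteSync(recursive: true);
  print(dir.existsSync()); // false
}
```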

Reading/writing a file

File file = new File('${cache}/test.txt');

// Read the file's contents (readAsString is asynchronous)
String content = await file.readAsString();
print(content);

// Write to the file
await file.writeAsString('file contents');

Reading and writing JSON files

JSON serialization in Flutter requires the dart:convert library:

import 'dart:convert' as convert;

Convert JSON objects with jsonEncode/jsonDecode:

var json = {
    'name': 'xiaoming',
    'age': 22,
    'address': 'hangzhou'
};

File jsonFile = new File('$cache/test.json');

// Write the JSON file
await jsonFile.writeAsString(convert.jsonEncode(json));

// Read the JSON file
var jsonStr = await jsonFile.readAsString();
var decoded = convert.jsonDecode(jsonStr);

print(decoded['name']); // xiaoming
print(decoded['age']); // 22
print(decoded['address']); // hangzhou
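Here is a self-contained, runnable version of the same round trip; it writes to a temp directory because the $cache path above comes from path_provider inside a real app:

```dart
import 'dart:convert' as convert;
import 'dart:io';

Future<void> main() async {
  final dir = Directory.systemTemp.createTempSync('json_demo');
  final jsonFile = File('${dir.path}/test.json');

  // Write the JSON file.
  final data = {'name': 'xiaoming', 'age': 22, 'address': 'hangzhou'};
  await jsonFile.writeAsString(convert.jsonEncode(data));

  // Read it back and decode.
  final decoded = convert.jsonDecode(await jsonFile.readAsString());
  print(decoded['name']); // xiaoming
  print(decoded['age']); // 22

  dir.deleteSync(recursive: true); // clean up the demo directory
}
```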

Copying a file

// Copy info.json from the test directory to info2.json in the test2 directory
File info1 = new File('$cache/test/info.json');

info1.copySync('$cache/test2/info2.json');

Compressing files with the archive plugin

Import the archive package:

import 'package:archive/archive.dart';
import 'package:archive/archive_io.dart';

Compress:

var encoder = ZipFileEncoder();
encoder.zipDirectory(Directory(path), filename: path + '.zip');

encoder.close();

Or build an archive file by file:

import 'package:archive/archive_io.dart';

void main() async {
  Directory appDocDirectory = await getExternalStorageDirectory();
  var encoder = ZipFileEncoder();
  encoder.create(appDocDirectory.path + "/" + 'jay.zip');
  encoder.addFile(File(selectedAdharFile));
  encoder.addFile(File(selectedIncomeFile));
  encoder.close();
}

Before compressing, create a ZipFileEncoder to handle the compression, then call its zipDirectory method. It takes the directory to compress as its first argument, and the filename parameter gives the path where the zip archive is saved;

Decompress:

List<int> bytes = File('test.zip').readAsBytesSync();

Archive archive = ZipDecoder().decodeBytes(bytes);
Posted in Flutter, 06/29/2022 18:04

Flutter textures on iOS

The requirement: play h264-format video. The video is decoded on the iOS side into CVPixelBuffers and played in Flutter, which uses a texture.

The flutter side:

1. Create a MethodChannel (the sample code below registers it as MethodChannel('com.ios.texture')),
used to communicate with the iOS side, mainly to fetch the _textureID from iOS.
2. Add Texture(textureId: _textureID) to the widget tree; this is what plays the video. Under the hood it reads the iOS side's CVPixelBuffer and renders it onto the Flutter page.
The code:

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: MyHomePage(title: 'Flutter Demo Home Page'),
    );
  }
}

class MyHomePage extends StatefulWidget {
  MyHomePage({Key key, this.title}) : super(key: key);
  final String title;
  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {

  MethodChannel _channel = MethodChannel('com.ios.texture');

  bool _isTextureOK = false;
  int _textureID = -1;

  @override
  void initState() {
    super.initState();

  }

  void getTexture() async {
    _textureID = await _channel.invokeMethod('newTexture');
    setState(() {
      _isTextureOK = true;
    });
  }

  Widget getTextureWidget(BuildContext context) {
    return Container(
      // color: Colors.red,
      width: 300,
      height: 300,
      child: Texture(textureId: _textureID,),
    );
  }

  @override
  Widget build(BuildContext context) {
    
    return Scaffold(
      appBar: AppBar(
        title: Text(widget.title),
      ),
      body: Stack(
        children: [
          Positioned.fill(
              child: Center(
                //在这里加载纹理Texture
                child: _isTextureOK ? getTextureWidget(context) : Text('video'),
              )
          ),
          Positioned(
            left: 0,
              bottom: 0,
              child: FlatButton(
              onPressed: (){
                getTexture();
              },
              child: Text("getTexture")
          )),
          Positioned(
            right: 0,
              bottom: 0,
              child: FlatButton(
              onPressed: (){
                _channel.invokeMethod('open');
              },
              child: Text("open camera")
          )),
        ],
      ),
    );
  }
}

The iOS side:

  1. Define a TexturePlugin class that implements the FlutterPlugin protocol. FlutterPlugin is a plugin protocol; implementing it lets you define a custom plugin. Implement the FlutterPlugin class method:

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar> *)registrar;

  2. In AppDelegate, register the plugin. AppDelegate subclasses FlutterAppDelegate, which in turn inherits from UIResponder and adopts the UIApplicationDelegate, FlutterPluginRegistry, and FlutterAppLifeCycleProvider protocols:

#import "AppDelegate.h"
#import "TexturePlugin.h"
@implementation AppDelegate
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [TexturePlugin registerWithRegistrar:[self registrarForPlugin:@"TexturePlugin"]];
  return [super application:application didFinishLaunchingWithOptions:launchOptions];
}
@end

  3. Create a GLRender class that implements FlutterTexture. The protocol method returns the CVPixelBuffer held in the _target property:

- (CVPixelBufferRef)copyPixelBuffer {
    // FlutterTexture protocol method; flutter directly reads the pixelBuffer we mapped the texture to
    return _target;
}

The TexturePlugin.h file

#import <Flutter/Flutter.h>

NS_ASSUME_NONNULL_BEGIN

@interface TexturePlugin : NSObject <FlutterPlugin>

@end

NS_ASSUME_NONNULL_END

The TexturePlugin.m file

#import "TexturePlugin.h"
#import "GLRender.h"
#import "ViddeoController.h"
@interface TexturePlugin ()<ViddeoControllerDelegate>
{
    ViddeoController *video;
    int64_t _textureId; // the ID obtained when the texture is registered
}
@property (nonatomic, strong) NSObject<FlutterTextureRegistry> *textures;
@property (nonatomic, strong) GLRender *glRender;

@end

@implementation TexturePlugin

- (instancetype) initWithTextures:(NSObject<FlutterTextureRegistry> *)textures {
    if (self = [super init]) {
        video = [[ViddeoController alloc] init];
        video.delegate = self;
        _textures = textures;
    }
    return self;
}

// Protocol method
+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar>*)registrar {
    // Create a FlutterMethodChannel to communicate with flutter.
    FlutterMethodChannel *channel = [FlutterMethodChannel methodChannelWithName:@"com.ios.texture" binaryMessenger:[registrar messenger]];
    // Create the plugin instance, passing it the object that implements <FlutterPluginRegistrar>
    TexturePlugin *instance = [[TexturePlugin alloc] initWithTextures:registrar.textures];
    // Make instance the channel's method-call delegate
    [registrar addMethodCallDelegate:instance channel:channel];
}
// FlutterMethodChannel delegate
- (void)handleMethodCall:(FlutterMethodCall*)call result:(FlutterResult)result
{
    if ([call.method isEqualToString:@"newTexture"]) {
        // flutter asked for a texture
        
        _glRender = [[GLRender alloc] init];
        // Generate the textureId
        _textureId = [_textures registerTexture:_glRender];
        // Hand the textureId back to flutter
        result(@(_textureId));
    }else if ([call.method isEqualToString:@"open"]){
        // Turn on the phone camera
        [video cameraButtonAction:YES];
        
    }
}
// The camera delivers each video frame wrapped as a CVPixelBufferRef
- (void)video:(CVImageBufferRef)imageBuffer
{
    [_glRender createCVBufferWith:imageBuffer];
    // Refresh the frame: tell flutter to read the new CVPixelBufferRef
    [self.textures textureFrameAvailable:_textureId];
//    CVPixelBufferRelease(imageBuffer);
}

@end

The GLRender.h file

#import <Foundation/Foundation.h>
#import <Flutter/Flutter.h>

NS_ASSUME_NONNULL_BEGIN

@interface GLRender : NSObject <FlutterTexture>

- (instancetype)init;

- (void)createCVBufferWith:(CVPixelBufferRef )target;
@end

NS_ASSUME_NONNULL_END

The GLRender.m file

@implementation GLRender
{
    CVPixelBufferRef _target;
}

- (CVPixelBufferRef)copyPixelBuffer {
    // FlutterTexture protocol method; flutter directly reads the pixelBuffer we mapped the texture to
    return _target;
}
- (void)createCVBufferWith:(CVPixelBufferRef )target
{
    _target = target;
}
@end

The merged version:

//
//  TexturePlugin.m
//  Runner
//
//  Created by jonasluo on 2019/12/11.
//  Copyright © 2019 The Chromium Authors. All rights reserved.
//

#import "TexturePlugin.h"
#import "ViddeoController.h"

@interface GLTexture : NSObject<FlutterTexture>
@property(nonatomic)CVPixelBufferRef target;
@end

@implementation GLTexture

- (CVPixelBufferRef)copyPixelBuffer {
    // FlutterTexture protocol method; flutter directly reads the pixelBuffer we mapped the texture to
    return _target;
}
@end

@interface TexturePlugin ()<ViddeoControllerDelegate,FlutterPlugin>
{
    ViddeoController *video; // converts the camera video into CVPixelBuffers
    int64_t _textureId;
    GLTexture *_glTexture;
}
@property (nonatomic, strong) NSObject<FlutterTextureRegistry> *textures; // in practice this is the FlutterEngine

@end

@implementation TexturePlugin

- (instancetype) initWithTextures:(NSObject<FlutterTextureRegistry> *)textures {
    if (self = [super init]) {
        video = [[ViddeoController alloc] init];
        video.delegate = self;
        _textures = textures;
    }
    return self;
}

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar>*)registrar {
    
    FlutterMethodChannel *channel = [FlutterMethodChannel methodChannelWithName:@"com.ios.texture" binaryMessenger:[registrar messenger]];
    
    TexturePlugin *instance = [[TexturePlugin alloc] initWithTextures:registrar.textures];
    
    [registrar addMethodCallDelegate:instance channel:channel];
}

- (void)handleMethodCall:(FlutterMethodCall*)call result:(FlutterResult)result
{
    if ([call.method isEqualToString:@"newTexture"]) {
  
        _glTexture = [[GLTexture alloc] init];
        _textureId = [_textures registerTexture:_glTexture];
        result(@(_textureId));
    }else if ([call.method isEqualToString:@"open"]){
        [video cameraButtonAction:YES];
    }
}

- (void)video:(CVImageBufferRef)imageBuffer
{
    _glTexture.target = imageBuffer;
    [self.textures textureFrameAvailable:_textureId];
}

@end

Another version that shares the CVPixelBuffer:

#import "GLRender.h"
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <CoreVideo/CoreVideo.h>
#import <UIKit/UIKit.h>

@implementation GLRender
{
    
    EAGLContext *_context;
    CGSize _size;
    CVOpenGLESTextureCacheRef _textureCache;
    CVOpenGLESTextureRef _texture;
    CVPixelBufferRef _target;
    GLuint _program;
    GLuint _frameBuffer;
}

- (CVPixelBufferRef)copyPixelBuffer {
    // FlutterTexture protocol method; flutter directly reads the pixelBuffer we mapped the texture to
    return _target;
}

- (instancetype)init
{
    if (self = [super init]) {
        _size = CGSizeMake(1000, 1000);
        
        [self initGL];
        [self loadShaders];
    }
    return self;
}

- (void)initGL {
    _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:_context];
    // First create the shared-memory pixelBuffer and texture objects (createCVBufferWith below)
    [self createCVBufferWith:&_target withOutTexture:&_texture];
        
    // Create the framebuffer
    glGenFramebuffers(1, &_frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, _frameBuffer);
    
    // Attach the texture to the framebuffer
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(_texture), 0);
    
    glViewport(0, 0, _size.width, _size.height);
    
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}

- (void)createCVBufferWith:(CVPixelBufferRef *)target withOutTexture:(CVOpenGLESTextureRef *)texture {
    // Create the texture cache (not the key point here)
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &_textureCache);
    if (err) {
        return;
    }
    
    CFDictionaryRef empty;
    CFMutableDictionaryRef attrs;
    empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    // The key parameter: shared memory requires setting kCVPixelBufferIOSurfacePropertiesKey
    CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);
    // Allocate the pixelBuffer's memory; note that flutter needs the BGRA format
    CVPixelBufferCreate(kCFAllocatorDefault, _size.width, _size.height, kCVPixelFormatType_32BGRA, attrs, target);
    // Map the pixelBuffer above onto a texture
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, *target, NULL, GL_TEXTURE_2D, GL_RGBA, _size.width, _size.height, GL_BGRA, GL_UNSIGNED_BYTE, 0, texture);
    
    CFRelease(empty);
    CFRelease(attrs);
}

- (void)deinitGL {
    glDeleteFramebuffers(1, &_frameBuffer);
    CFRelease(_target);
    CFRelease(_textureCache);
    CFRelease(_texture);
}

- (void)createCVBufferWith:(CVPixelBufferRef )target
{
    _target = target;
}


#pragma mark - shader compilation
- (BOOL)loadShaders
{
    GLuint vertShader, fragShader;
    NSString *vertShaderPathname, *fragShaderPathname;
    
    _program = glCreateProgram();
    
    vertShaderPathname = [[NSBundle mainBundle] pathForResource:@"Shader" ofType:@"vsh"];
    if (![self compileShader:&vertShader type:GL_VERTEX_SHADER file:vertShaderPathname]) {
        NSLog(@"failed to compile vertex shader");
        return NO;
    }
    
    fragShaderPathname = [[NSBundle mainBundle] pathForResource:@"Shader" ofType:@"fsh"];
    if (![self compileShader:&fragShader type:GL_FRAGMENT_SHADER file:fragShaderPathname]) {
        NSLog(@"failed to compile fragment shader");
        return NO;
    }
    
    glAttachShader(_program, vertShader);
    glAttachShader(_program, fragShader);
    
    if (![self linkProgram:_program]) {
        NSLog(@"failed to link program: %d", _program);
        
        if (vertShader) {
            glDeleteShader(vertShader);
            vertShader = 0;
        }
        if (fragShader) {
            glDeleteShader(fragShader);
            fragShader = 0;
        }
        if (_program) {
            glDeleteProgram(_program);
            _program = 0;
        }
        return NO;
    }
    
    if (vertShader) {
       glDetachShader(_program, vertShader);
       glDeleteShader(vertShader);
    }
    if (fragShader) {
       glDetachShader(_program, fragShader);
       glDeleteShader(fragShader);
    }
    
    NSLog(@"load shaders succ");
    return YES;
}

- (BOOL)compileShader:(GLuint *)shader type:(GLenum)type file:(NSString *)file
{
    GLint status;
    const GLchar *source;
    
    source = (GLchar*)[[NSString stringWithContentsOfFile:file encoding:NSUTF8StringEncoding error:nil] UTF8String];
    if (!source) {
        NSLog(@"failed to load shader. type: %i", type);
        return NO;
    }
    
    *shader = glCreateShader(type);
    glShaderSource(*shader, 1, &source, NULL);
    glCompileShader(*shader);
    
    #if defined(DEBUG)
       GLint logLength;
       glGetShaderiv(*shader, GL_INFO_LOG_LENGTH, &logLength);
       if (logLength > 0) {
          GLchar *log = (GLchar *)malloc(logLength);
          glGetShaderInfoLog(*shader, logLength, &logLength, log);
          NSLog(@"Shader compile log:\n%s", log);
          free(log);
       }
    #endif
    
    glGetShaderiv(*shader, GL_COMPILE_STATUS, &status);
    if (status == 0) {
       glDeleteShader(*shader);
       return NO;
    }
    
    return YES;
}

- (BOOL)linkProgram:(GLuint)prog
{
    GLint status;
    glLinkProgram(prog);
    
    glGetProgramiv(prog, GL_LINK_STATUS, &status);
    if (status == 0) {
       return NO;
    }
    
    return YES;
}

- (BOOL)validateProgram:(GLuint)prog
{
    GLint logLength, status;
    glValidateProgram(prog);
    glGetProgramiv(prog, GL_INFO_LOG_LENGTH, &logLength);
    if (logLength > 0) {
        GLchar *log = (GLchar *)malloc(logLength);
        glGetProgramInfoLog(prog, logLength, &logLength, log);
        NSLog(@"program validate log : \n%s", log);
        free(log);
    }
    
    glGetProgramiv(prog, GL_VALIDATE_STATUS, &status);
    if (status == 0) {
        return NO;
    }
    
    return YES;
}

@end
Posted in Flutter, 06/28/2022 09:47 AM

The Best Way to Compress Images in Flutter

Introduction

As developers we often build features that upload or save images, but some images are very large, and uploading or saving them eats a lot of network and local resources. What we need to do is compress the images.

Yesterday, while writing

Implementing WeChat sharing in Flutter [Flutter Topic 23] (mp.weixin.qq.com/s/PGpgau6mJLAbfKMVYqTuOg)

I needed exactly this: image compression.

I used flutter_image_compress. You may know that Dart already has image-compression libraries, so why use a native one? Efficiency. So today let's go over how to use it.

1. flutter_image_compress

Install

dependencies:
  flutter_image_compress: ^1.0.0-nullsafety

Import it where used

import 'package:flutter_image_compress/flutter_image_compress.dart';
  /// Compress an image: File -> Uint8List
  Future<Uint8List> testCompressFile(File file) async {
    var result = await FlutterImageCompress.compressWithFile(
      file.absolute.path,
      minWidth: 2300,
      minHeight: 1500,
      quality: 94,
      rotate: 90,
    );
    print(file.lengthSync());
    print(result.length);
    return result;
  }
​
  /// Compress an image: File -> File
  Future<File> testCompressAndGetFile(File file, String targetPath) async {
    var result = await FlutterImageCompress.compressAndGetFile(
        file.absolute.path, targetPath,
        quality: 88,
        rotate: 180,
      );
​
    print(file.lengthSync());
    print(result.lengthSync());
​
    return result;
  }
​
  /// Compress an image: Asset -> Uint8List
  Future<Uint8List> testCompressAsset(String assetName) async {
    var list = await FlutterImageCompress.compressAssetImage(
      assetName,
      minHeight: 1920,
      minWidth: 1080,
      quality: 96,
      rotate: 180,
    );
​
    return list;
  }
​
​
  /// Compress an image: Uint8List -> Uint8List
  Future<Uint8List> testCompressList(Uint8List list) async {
    var result = await FlutterImageCompress.compressWithList(
      list,
      minHeight: 1920,
      minWidth: 1080,
      quality: 96,
      rotate: 135,
    );
    print(list.length);
    print(result.length);
    return result;
  }

There are two other approaches.

2. Use the imageQuality parameter of the image_picker package

image_picker

3. Use the flutter_native_image package

flutter_native_image

Install

flutter_native_image: ^0.0.6

Documentation

https://pub.flutter-io.cn/packages/flutter_native_image

Usage

Future<File> compressFile(File file) async{
    File compressedFile = await FlutterNativeImage.compressImage(file.path,
        quality: 5,);
    return compressedFile;
  }

How do you calculate the size of the selected image file?

You can get the file length in bytes and convert it to kilobytes, megabytes, and so on.

Like this: file.readAsBytesSync().lengthInBytes -> file size in bytes

(file.readAsBytesSync().lengthInBytes) / 1024 -> file size in kilobytes

(file.readAsBytesSync().lengthInBytes) / 1024 / 1024 -> file size in megabytes
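Those conversions can be wrapped in a small helper. This is a runnable plain-Dart sketch; the helper name and the temp file are mine, not from any package:

```dart
import 'dart:io';

// Convert a file's length in bytes to KB and MB for display.
String describeSize(File file) {
  final bytes = file.lengthSync();
  final kb = bytes / 1024;
  final mb = kb / 1024;
  return '$bytes B = ${kb.toStringAsFixed(2)} KB = ${mb.toStringAsFixed(4)} MB';
}

void main() {
  final file = File('${Directory.systemTemp.path}/size_demo.bin')
    ..writeAsBytesSync(List.filled(2048, 0));
  print(describeSize(file)); // 2048 B = 2.00 KB = 0.0020 MB
  file.deleteSync();
}
```

Comparing this value before and after compression is a quick way to benchmark the three libraries against each other.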

Summary

Today's article covered three approaches to image compression, each backed by a different library. Try them out and compare which library performs best.

I'm Jianguo.

How to create custom icons in Flutter [Flutter Topic 22] (mp.weixin.qq.com/s/1h19t1EAaGTmrFI8gaDLWA)

More good content awaits your discovery.

Posted in Flutter, 06/27/2022 15:12

flutter_bloc Usage Explained

flutter_bloc usage is explained along the three dimensions shown in the diagram below.

The generated code

Look at the three generated bloc files: main_bloc, main_event, main_state.

main_bloc: this is where we write the main logic. mapEventToState takes only one parameter, and the generator leaves a trailing comma after it, so formatting splits it across three lines; I suggest deleting the comma and reformatting.

class MainBloc extends Bloc<MainEvent, MainState> {
  MainBloc() : super(MainInitial());

  @override
  Stream<MainState> mapEventToState(
    MainEvent event,
  ) async* {
    // TODO: implement mapEventToState
  }
}

main_event: the various events to execute, somewhat like fish_redux's action layer

@immutable
abstract class MainEvent {}

main_state: state data is kept and passed along here

@immutable
abstract class MainState {}

class MainInitial extends MainState {}

Implementation

Main entry point

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: MainPage(),
    );
  }
}

Notes

For a simple page, modeling state as an abstract class with concrete subclasses is a bit cumbersome, so I make a small change here. There are many ways to shape state; even the official demos sometimes skip the abstract class and use a plain class directly, like an entity.

The comments in the code below matter; read them carefully.

main_bloc
The state variable is defined inside the framework; by default it holds the last MainState object that was emitted.

class MainBloc extends Bloc<MainEvent, MainState> {
  MainBloc() : super(MainState(selectedIndex: 0, isExtended: false));

  @override
  Stream<MainState> mapEventToState(MainEvent event) async* {
    /// Events added in main_view are called back here; once the data is
    /// processed, yield it and BlocBuilder rebuilds the widgets
    if (event is SwitchTabEvent) {
      /// Take the value the event passed over and stuff it into MainState.
      /// If you mutate the fields on the existing state and yield it,
      /// BlocBuilder fires only once: internally it compares against the
      /// previous MainState object and skips the build when they are equal
      yield MainState()
        ..selectedIndex = event.selectedIndex
        ..isExtended = state.isExtended;
    } else if (event is IsExtendEvent) {
      yield MainState()
        ..selectedIndex = state.selectedIndex
        ..isExtended = !state.isExtended;
    }
  }
}

全局 Bloc

说明

什么是全局 Bloc?

BlocProvider 介绍里面有这样的形容:BlocProvider should be used to create new blocs which will be made available to the rest of the subtree(BlocProvider 应该被用于创建新的 Bloc,这些 Bloc 将可用于其子树)

这样的话,我们只需要在主入口地方使用 BlocProvider 创建 Bloc,就能使用全局的 XxxBloc 了,这里的全局 XxxBloc,state 状态都会被保存的,除非关闭 app,否则 state 里面的数据都不会被还原!

Note: an XxxBloc created at the entry point is created exactly once; no other page needs to create it again. Any page can use BlocBuilder to rebuild specific widgets and read the global XxxBloc's state.

Use cases

Global configuration changes such as theme color, font style, and font size: wherever a global property is needed, use a BlocBuilder parameterized with the global XxxBloc type to refresh the data.

Cross-page events: since the Bloc is global, any page can call events on it via BlocProvider.of(context), which effectively lets one page trigger events on another.

When using a global Bloc for cross-page events, be aware that closing a page does not reset the corresponding global Bloc: the next time you enter the page, its data is whatever the previous visit left behind. Use a StatefulWidget and initialize the data in initState, or reset the data source in dispose.

Keep in mind: a global Bloc lives for the entire lifetime of the app, so you must not create too many of them; using a global Bloc to pass events across pages should only be a fallback.
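The setup described above can be sketched as follows. This is a minimal, self-contained example under assumed names: `ThemeCubit` and its font-size state are illustrative (a Cubit is used here for brevity; the same pattern applies to a Bloc):

```dart
import 'package:flutter/material.dart';
import 'package:flutter_bloc/flutter_bloc.dart';

// Hypothetical global cubit holding a font size (names are illustrative).
class ThemeCubit extends Cubit<double> {
  ThemeCubit() : super(14.0);
  void increaseFont() => emit(state + 2);
}

void main() {
  runApp(
    // Created once above MaterialApp: available to the whole subtree,
    // and its state survives page pushes/pops for the app's lifetime.
    BlocProvider<ThemeCubit>(
      create: (_) => ThemeCubit(),
      child: const MyApp(),
    ),
  );
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: Center(
          // BlocBuilder rebuilds only this widget when a new state is emitted.
          child: BlocBuilder<ThemeCubit, double>(
            builder: (context, fontSize) => TextButton(
              // Any page below the provider can reach the global cubit
              // through the context, enabling cross-page events.
              onPressed: () =>
                  BlocProvider.of<ThemeCubit>(context).increaseFont(),
              child: Text('font size: $fontSize',
                  style: TextStyle(fontSize: fontSize)),
            ),
          ),
        ),
      ),
    );
  }
}
```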

06/22/2022 07:54 AM posted in  Flutter

Several Code Examples for the BLoC Architecture in Flutter


Using MultiBlocProvider


class HomeWidget extends StatelessWidget {
  const HomeWidget({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return MultiBlocProvider(providers: [
      BlocProvider<HomeBloc>(
        create: (context) => HomeBloc(),
      ),
      BlocProvider<HomeTreatmentCubit>(
        create: (context) => HomeTreatmentCubit(),
      ),
      BlocProvider<HomeOralInspectionCubit>(
        create: (context) => HomeOralInspectionCubit(),
      ),
      BlocProvider<HomeSmileSolutionCubit>(
        create: (context) => HomeSmileSolutionCubit(),
      ),
    ], child: const HomePage());
  }
}

class HomePage extends StatefulWidget {
  const HomePage({Key? key}) : super(key: key);

  @override
  State<HomePage> createState() => _HomePageState();
}
class _HomePageState extends State<HomePage> {
  @override
  void initState() {
    super.initState();
  }

  // _onRefresh: pull-to-refresh callback
  Future _onRefresh() async {
    Log.d("HomePage execute refresh");
    //HomeBloc homeBloc = BlocProvider.of<HomeBloc>(context);
    //finish refreshing
    return Future.value(true);
  }

  @override
  Widget build(BuildContext context) {
    ScrollController scrollController = ScrollController();

    EdgeInsets paddings = MediaQuery.of(context).padding;
    return Scaffold(
      appBar: AppBar(
        backgroundColor: ColorT.appBarBackground,
        leading: Container(),
        title: const Text(
          "Home",
          style: TextStyle(
              fontSize: 18,
              color: ColorT.appBarTitle,
              fontWeight: FontWeight.bold),
        ),
        elevation: 0,
      ),
      body: SafeArea(
        top: false,
        bottom: true,
        left: true,
        right: false,
        child: Container(
          color: ColorT.primaryBackground,
          margin: const EdgeInsets.fromLTRB(0, 0, 0, 0),
          padding: EdgeInsets.fromLTRB(10, 10, 10, paddings.bottom),
          child: RefreshIndicator(
            onRefresh: _onRefresh,
            displacement: 40,
            child: ListView(
              controller: scrollController,
              padding: const EdgeInsets.fromLTRB(0, 0, 0, 0),
              shrinkWrap: true,
              children: const <Widget>[
                HomeTreatmentWidget(),
                HomeOralInspectionWidget(),
                HomeSmileSolutionWidget()
              ],
            ),
          ),
        ),
      ),
    );
  }
}
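The commented-out line in `_onRefresh` above hints at how the refresh would be wired to the bloc. A minimal sketch, assuming a hypothetical `RefreshHomeEvent` (not defined in this post):

```dart
// Sketch only: RefreshHomeEvent is a hypothetical event type.
Future<void> _onRefresh() async {
  Log.d("HomePage execute refresh");
  // Dispatch a refresh event to the HomeBloc provided higher in the tree.
  BlocProvider.of<HomeBloc>(context).add(RefreshHomeEvent());
  // RefreshIndicator hides its spinner once this future completes; a real
  // implementation would await the bloc finishing its reload first.
}
```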

06/22/2022 07:49 AM posted in  Flutter

Implementing Continuous Background Location with AMap in a Flutter Project (iOS)

First, AMap itself supports continuous background location: see its sample documentation. For Flutter projects, AMap also provides framework support: see its documentation.

The pubspec.yaml is as follows:

dependencies:
  flutter:
    sdk: flutter
  # Permission handling
  permission_handler: ^5.1.0+2
  # Location
  amap_location_fluttify: ^0.20.0

We take an iOS project as the example for the implementation logic.

iOS project (ios/Runner) configuration:

Add the location permission entries

<key>NSLocationAlwaysAndWhenInUseUsageDescription</key>
<string>Always permission is requested so the app can receive location updates both in the foreground and in the background (suspended or terminated)</string>
<key>NSLocationAlwaysUsageDescription</key>
<string>Your consent is required to access location at all times</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>Your consent is required to access location while the app is in use</string>

The required keys vary with the iOS version.

Background task (Background Modes) configuration

<key>UIBackgroundModes</key>
<array>
	<string>location</string>
	<string>remote-notification</string>
</array>

This corresponds to checking the Location updates option.

Flutter Project Example

A concrete example of the usage in Flutter:

  1. Register the AMap component in the main function
  2. Request permissions in the view before calling any location API
  3. Enable the background task feature
  4. Start continuous location

main.dart:

import 'package:amap_location_fluttify/amap_location_fluttify.dart';

void main() {
  runApp(const MyApp());
  // Register the AMap component
  AmapLocation.instance.init(iosKey: 'xxxxxx');
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: LocationPage(),
    );
  }
}

location_page.dart:

import 'package:amap_location_fluttify/amap_location_fluttify.dart';
import 'package:flutter/material.dart';
import 'dart:async';
import 'package:permission_handler/permission_handler.dart';

class LocationPage extends StatefulWidget {
  LocationPage({Key? key}) : super(key: key);

  _LocationPageState createState() => _LocationPageState();
}

class _LocationPageState extends State<LocationPage> {
  // location result
  // Map<String, Object> _locationResult;
  String _latitude = ""; // latitude
  String _longitude = ""; // longitude

  @override
  void initState() {
    super.initState();
    /// Request location permission dynamically
    requestPermission();
  }

  @override
  void dispose() {
    super.dispose();
  }

  /// Request location permission dynamically
  void requestPermission() async {
    // request the permission
    bool hasLocationPermission = await requestLocationPermission();
    if (hasLocationPermission) {
      print("Location permission granted");
    } else {
      print("Location permission denied");
    }
  }

  /// Request location permission; returns true if granted, false otherwise
  Future<bool> requestLocationPermission() async {
    // check the current permission status
    var status = await Permission.locationAlways.status;
    if (status == PermissionStatus.granted) {
      // already granted
      return true;
    } else {
      // not granted yet, request once
      status = await Permission.location.request();
      if (status == PermissionStatus.granted) {
        return true;
      } else {
        return false;
      }
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text("Geolocation Demo"),
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            //  latitude: 36.570091461155336, longitude: 109.5080830206976
            //
            Text("Latitude: ${this._latitude}"),
            Text("Longitude: ${this._longitude}"),
            SizedBox(height: 20),
            ElevatedButton(
              child: Text('Start Locating'),
              onPressed: () {
                this._startTheLocation();
              },
            ),
          ],
        ),
      ),
    );
  }

  Future _startTheLocation() async {
    if (await Permission.location.request().isGranted) {

      // Enable continuous background location
      await AmapLocation.instance.enableBackgroundLocation(
        10,
        BackgroundNotification(
          contentTitle: 'contentTitle',
          channelId: 'channelId',
          contentText: 'contentText',
          channelName: 'channelName',
        ),
      );
      
      // Listen for continuous location updates
      AmapLocation.instance.listenLocation().listen((location) {
        setState(() {
          _latitude = location.latLng.latitude.toString();
          _longitude = location.latLng.longitude.toString();
          print("Location update: {$_latitude, $_longitude}");
        });
      });
    } else {
      openAppSettings();
    }
  }

}

Summary

For AMap, continuous background location boils down to two core calls:

Enable the background task:

AmapLocation.instance.enableBackgroundLocation(id, notification)

Start continuous location:

AmapLocation.instance.listenLocation().listen((location) {
    // do something
});
06/10/2022 12:10 PM posted in  Flutter

Flutter Development Notes (1)

Error: Cannot run with sound null safety, because the following dependencies don't support null safety

When some of your dependencies have not migrated to null safety, build with sound null safety disabled:

flutter build ios --no-sound-null-safety
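Besides passing the flag on every build, Dart also lets you opt an individual library in your own code out of null safety with a language-version comment (this applies per file and only helps for code you control, not for unmigrated third-party packages):

```dart
// @dart=2.9
// The language-version comment above must appear before any code in the
// file; it opts this single library out of sound null safety.
import 'package:flutter/material.dart';
```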

Upgrading Flutter to a Specific Version: Upgrade and Rollback

Relevant commands:

Show the version: flutter --version
Check the environment: flutter doctor
Show the channel: flutter channel
Switch channels (stable, beta, dev, master): flutter channel stable
Upgrade to the latest version: flutter upgrade
Upgrade to a specific version: flutter upgrade v2.2.3
Roll back to a specific version: flutter downgrade v2.0.3

You can also roll back via git:

  1. On the flutter GitHub repository, find the commit of the version you want to roll back to
  2. cd into the directory where the flutter SDK is stored and run the rollback command: git reset --hard [commit_id]
     e.g. -> git reset --hard 4d7946a68d26794349189cf21b3f68cc6fe61dcb
  3. Check the flutter version with flutter doctor or flutter --version
06/09/2022 15:26 posted in  Flutter