Our company's logging system uses plumelog. Recently, production kept reporting that the Jedis connection pool was exhausted, which caused log loss, and services kept restarting; we suspected the logging system. So I reworked plumelog: the server side is generated with Go gRPC, the client side with Java gRPC, and logs are shipped to the server as gRPC calls.
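The core of the fix is decoupling application threads from the log transport, so that a slow or exhausted backend drops log entries instead of stalling (or restarting) the service. A minimal Go sketch of that idea, using a bounded channel with drop-on-full semantics (names like `logQueue` are illustrative, not plumelog's actual code):

```go
package main

import "fmt"

// logQueue decouples application threads from the network sender:
// Append never blocks; when the buffer is full the entry is dropped
// and counted instead of stalling the caller.
type logQueue struct {
	ch      chan string
	dropped int
}

func newLogQueue(size int) *logQueue {
	return &logQueue{ch: make(chan string, size)}
}

// Append enqueues a log line, returning false if it had to be dropped.
func (q *logQueue) Append(msg string) bool {
	select {
	case q.ch <- msg:
		return true
	default: // buffer full: drop rather than block the application
		q.dropped++
		return false
	}
}

func main() {
	q := newLogQueue(2)
	fmt.Println(q.Append("a"), q.Append("b"), q.Append("c")) // true true false
	fmt.Println("dropped:", q.dropped)                       // dropped: 1
}
```

A background goroutine would drain `q.ch` and ship batches over gRPC; the `dropped` counter makes the loss visible instead of silent.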
Pick the protoc release you need, unzip it, and copy protoc.exe into the bin directory of your Go installation.
go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.28
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@v1.2
Copy the generated plugin binaries into the bin directory of your Go installation.
Write the .proto file:
syntax = "proto3";

// package for the generated code
package server;

option go_package = "plumelog/rpc;server";

// request message
message PlumelogRequest {
  string message = 1; // 1 is the field number (the field's wire identifier)
}

// response message
message PlumelogResponse {
  string message = 1; // 1 is the field number
}

// service definition
service PlumelogService {
  // method definition
  rpc GetPlumelog(PlumelogRequest) returns (PlumelogResponse);
}
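The `= 1` in `string message = 1` is the field number, i.e. the field's on-the-wire identifier, not an ordering hint. A small hand-encoding sketch shows how field 1 with wire type 2 (length-delimited) becomes the tag byte `0x0A` (`encodeMessageField` is a hypothetical helper, valid for strings under 128 bytes so the length fits in a single varint byte):

```go
package main

import "fmt"

// encodeMessageField hand-encodes proto3 field number 1 with wire type 2
// (length-delimited): tag byte (1<<3)|2 = 0x0A, then the length, then the
// UTF-8 bytes. Illustrative only; real code uses the generated marshaller.
func encodeMessageField(s string) []byte {
	return append([]byte{0x0A, byte(len(s))}, s...)
}

func main() {
	fmt.Printf("% x\n", encodeMessageField("hi")) // 0a 02 68 69
}
```

Renumbering a field therefore breaks wire compatibility even if the declaration order stays the same.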
Project structure (screenshot):
In a terminal, cd into the proto directory and run the following to generate the gRPC code (the legacy `--go_out=plugins=grpc` syntax is rejected by protoc-gen-go v1.28; the message and service code come from two separate plugins):

protoc --go_out=. --go-grpc_out=. server.proto
main.go:

package main

import (
	"log"
	"net"

	"google.golang.org/grpc"
	"plumelog/rpc"
	"plumelog/server"
)

func main() {
	// 1. create a gRPC server
	rpcServer := grpc.NewServer()
	// 2. register our RpcServer implementation
	rpc.RegisterPlumelogServiceServer(rpcServer, new(server.RpcServer))
	// 3. create a listener on TCP port 8899
	listener, err := net.Listen("tcp", ":8899")
	if err != nil {
		log.Fatal("failed to listen on the service port: ", err)
	}
	// 4. run the rpcServer on the listener
	_ = rpcServer.Serve(listener)
}
server.go:

package server

import (
	"context"
	"fmt"

	"plumelog/rpc"
)

type RpcServer struct {
	// required by protoc-gen-go-grpc for forward compatibility
	rpc.UnimplementedPlumelogServiceServer
}

// GetPlumelog implements the rpc declared in server.proto:
// print the log payload and echo it back.
func (*RpcServer) GetPlumelog(ctx context.Context, req *rpc.PlumelogRequest) (*rpc.PlumelogResponse, error) {
	fmt.Println(req.Message)
	return &rpc.PlumelogResponse{Message: req.Message}, nil
}
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.plumelog</groupId>
        <artifactId>plumelog</artifactId>
        <version>3.5</version>
    </parent>
    <artifactId>plumelog-logback</artifactId>
    <name>plumelog-logback</name>
    <packaging>jar</packaging>

    <dependencies>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
            <version>1.2.3</version>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.3</version>
        </dependency>
        <dependency>
            <groupId>com.plumelog</groupId>
            <artifactId>plumelog-core</artifactId>
            <version>${project.parent.version}</version>
        </dependency>
        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>3.5.1</version>
        </dependency>
        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java-util</artifactId>
            <version>3.5.1</version>
        </dependency>
        <dependency>
            <groupId>io.grpc</groupId>
            <artifactId>grpc-all</artifactId>
            <version>1.12.0</version>
        </dependency>
        <dependency>
            <groupId>javax.annotation</groupId>
            <artifactId>javax.annotation-api</artifactId>
            <version>1.3.2</version>
            <scope>compile</scope>
        </dependency>
    </dependencies>

    <build>
        <extensions>
            <extension>
                <groupId>kr.motd.maven</groupId>
                <artifactId>os-maven-plugin</artifactId>
                <version>1.6.2</version>
            </extension>
        </extensions>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <executions>
                    <execution>
                        <id>copy-dependencies</id>
                        <phase>package</phase>
                        <goals>
                            <goal>copy-dependencies</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${project.build.directory}/lib</outputDirectory>
                            <overWriteReleases>false</overWriteReleases>
                            <overWriteSnapshots>false</overWriteSnapshots>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.xolstice.maven.plugins</groupId>
                <artifactId>protobuf-maven-plugin</artifactId>
                <version>0.5.0</version>
                <configuration>
                    <protocArtifact>com.google.protobuf:protoc:3.1.0:exe:${os.detected.classifier}</protocArtifact>
                    <pluginId>grpc-java</pluginId>
                    <pluginArtifact>io.grpc:protoc-gen-grpc-java:1.11.0:exe:${os.detected.classifier}</pluginArtifact>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Maven plugins (shown in the IDE's Maven panel). The client-side .proto file:
// this is the protobuf definition file
// the proto syntax version: 2 or 3
syntax = "proto3";

// package for the generated code
package server;

option java_package = "com.plumelog.logback.rpc";
option java_multiple_files = false;
option java_outer_classname = "RpcClient";

// request message
message PlumelogRequest {
  string message = 1; // 1 is the field number
}

// response message
message PlumelogResponse {
  string message = 1; // 1 is the field number
}

// service definition
service PlumelogService {
  // method definition
  rpc GetPlumelog(PlumelogRequest) returns (PlumelogResponse);
}
In the Maven panel, double-click protobuf:compile to generate the protobuf message code, then protobuf:compile-custom to generate the gRPC stub code.
Calling it from the logback appender:
private String rpcHost = "127.0.0.1";
private int rpcPort = 8899;
ManagedChannel channel = ManagedChannelBuilder.forAddress(rpcHost, rpcPort)
        .usePlaintext()
        .build();

@Override
protected void append(ILoggingEvent event) {
    if (event != null) {
        send(event);
    }
}

protected void send(ILoggingEvent event) {
    final BaseLogMessage logMessage = LogMessageUtil.getLogMessage(appName, env, event);
    if (logMessage instanceof RunLogMessage) {
        final String message = LogMessageUtil.getLogMessage(logMessage, event);
        PlumelogRpcClient.callRpcServer(channel, message);
    }
}
package com.plumelog.logback.util;

import com.plumelog.logback.rpc.PlumelogServiceGrpc;
import com.plumelog.logback.rpc.RpcClient;
import io.grpc.ManagedChannel;

public class PlumelogRpcClient {
    public static void callRpcServer(ManagedChannel channel, String message) {
        RpcClient.PlumelogRequest request = RpcClient.PlumelogRequest.newBuilder()
                .setMessage(message)
                .build();
        try {
            // blocking stub call; the method name matches the GetPlumelog rpc in the .proto
            PlumelogServiceGrpc.newBlockingStub(channel).getPlumelog(request);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
In your service, reference this log library by its coordinates in the local Maven repository and start the service; the Go server must be running first.