
[Abandoned] HADOOP-16612 Track Azure Blob File System client-perceived latency #1569

Closed (wants to merge 67 commits)
Changes from all commits (67 commits)
4d25cc9
track ADLS end-to-end latency
jemangwa Sep 30, 2019
770adc5
HADOOP-16578 : Avoid FileSystem API calls when FileSystem already exists
snvijaya Oct 2, 2019
d1ddba6
YARN-9862. Increase yarn-services-core test timeout value.
macroadster Oct 2, 2019
3df733c
HDFS-14885. UI: Fix a typo on WebUI of DataNode. Contributed by Xiemi…
aajisaka Oct 2, 2019
41440ec
HDDS-2210. ContainerStateMachine should not be marked unhealthy if ap…
bshashikant Oct 2, 2019
f1ba9bf
HDDS-2187. ozone-mr test fails with No FileSystem for scheme "o3fs"
adoroszlai Oct 2, 2019
61a8436
YARN-9870. Remove unused function from OpportunisticContainerAllocato…
abmodi Oct 2, 2019
2e1fd44
HDDS-2201. Rename VolumeList to UserVolumeInfo. (#1566)
anuengineer Oct 2, 2019
0d2d6f9
YARN-9792. Document examples of SchedulerConf with Node Labels. Contr…
sunilgovind Oct 2, 2019
e8ae632
HDDS-2068. Make StorageContainerDatanodeProtocolService message based
elek Sep 23, 2019
ffd4e52
HDDS-2073. Make SCMSecurityProtocol message based.
anuengineer Oct 2, 2019
685918e
HDDS-2227. GDPR key generation could benefit from secureRandom. (#1574)
anuengineer Oct 2, 2019
169cef7
HDDS-2162. Make OM Generic related configuration support HA style con…
bharatviswa504 Oct 2, 2019
53ed78b
HDDS-2224. Fix loadup cache for cache cleanup policy NEVER. (#1567)
bharatviswa504 Oct 2, 2019
b09d389
HDDS-2019. Handle Set DtService of token in S3Gateway for OM HA. (#1489)
bharatviswa504 Oct 2, 2019
559ee27
HADOOP-16599. Allow a SignerInitializer to be specified along with a …
sidseth Oct 2, 2019
1303255
HDFS-14858. [SBN read] Allow configurably enable/disable AlignmentCon…
Oct 2, 2019
4c24f24
HDDS-2072. Make StorageContainerLocationProtocolService message based
anuengineer Oct 2, 2019
c5665b2
HDDS-2228. Fix NPE in OzoneDelegationTokenManager#addPersistedDelegat…
xiaoyuyao Oct 3, 2019
0e026cb
HADOOP-16620. [pb-upgrade] Remove protocol buffers 3.7.1 from require…
aajisaka Oct 3, 2019
c19fa3d
HADOOP-16605. Fix testcase testSSLChannelModeConfig
snvijaya Oct 3, 2019
5a7483c
HDFS-14888. RBF: Enable Parallel Test Profile for builds. Contributed…
ayushtkn Oct 3, 2019
d59bcbf
HDDS-2226. S3 Secrets should use a strong RNG. (#1572)
anuengineer Oct 3, 2019
a3fe404
HDFS-14881. Safemode 'forceExit' option, doesn’t shown in help messag…
ayushtkn Oct 3, 2019
51eaeca
HDDS-2211. Collect docker logs if env fails to start (#1553)
adoroszlai Oct 3, 2019
47d721d
HDDS-2234. rat.sh fails due to ozone-recon-web/build files (#1580)
adoroszlai Oct 3, 2019
9446686
HDDS-2231. test-single.sh cannot copy results (#1575)
adoroszlai Oct 3, 2019
76605f1
HDDS-1720 : Add ability to configure RocksDB logs for Ozone Manager.
Sep 27, 2019
b7cb8fe
HDDS-2200 : Recon does not handle the NULL snapshot from OM DB cleanly.
Oct 2, 2019
1dde3ef
HADOOP-16624. Upgrade hugo to the latest version in Dockerfile
pingsutw Oct 2, 2019
cdaa480
HDDS-2198. SCM should not consider containers in CLOSING state to com…
nandakumar131 Oct 4, 2019
9700e20
HDDS-2223. Support ReadWrite lock in LockManager. (#1564)
nandakumar131 Oct 4, 2019
844b766
HDFS-14889. Ability to check if a block has a replica on provided sto…
virajith Oct 4, 2019
c99a121
HDFS-14637. Namenode may not replicate blocks to meet the policy afte…
Oct 4, 2019
ec8f691
HDDS-2225. SCM fails to start in most unsecure environments due to le…
adoroszlai Oct 4, 2019
b23bdaf
HDFS-14879. Header was wrong in Snapshot web UI. Contributed by heman…
tasanuma Oct 4, 2019
2478cba
YARN-9782. Avoid DNS resolution while running SLS. Contributed by Abh…
abmodi Oct 4, 2019
4cf0b36
HDDS-2222 (#1578)
szetszwo Oct 4, 2019
a9849f6
Revert "HDDS-2222 (#1578)" (#1594)
szetszwo Oct 4, 2019
bffcd33
HDDS-2230. Invalid entries in ozonesecure-mr config
elek Oct 4, 2019
d061c84
HDDS-2140. Add robot test for GDPR feature
dineshchitlangia Oct 4, 2019
6171a41
HDDS-2199. In SCMNodeManager dnsToUuidMap cannot track multiple DNs o…
Oct 4, 2019
bca014b
HDDS-2216. Rename HADOOP_RUNNER_VERSION to OZONE_RUNNER_VERSION in co…
cxorm Oct 3, 2019
f44abc3
HADOOP-16207 Improved S3A MR tests.
steveloughran Oct 4, 2019
531cc93
HDDS-2222. Add a method to update ByteBuffer in PureJavaCrc32/PureJav…
szetszwo Oct 4, 2019
f826420
HDDS-2230. Invalid entries in ozonesecure-mr config. (Addendum)
adoroszlai Oct 4, 2019
4510970
YARN-9873. Mutation API Config Change updates Version Number. Contrib…
sunilgovind Oct 4, 2019
3f16651
HDDS-2237. KeyDeletingService throws NPE if it's started too early (#…
elek Oct 4, 2019
aa24add
HDFS-14890. Fixed namenode and journalnode startup on Windows.
macroadster Oct 4, 2019
6574f27
HADOOP-16570. S3A committers encounter scale issues.
steveloughran Oct 4, 2019
10bdc59
HADOOP-16579. Upgrade to Apache Curator 4.2.0 excluding ZK (#1531). C…
nkalmar Oct 4, 2019
f3eaa84
HDDS-2164 : om.db.checkpoints is getting filling up fast. (#1536)
avijayanhwx Oct 4, 2019
8de4374
HDDS-2158. Fixing Json Injection Issue in JsonUtils. (#1486)
hanishakoneru Oct 4, 2019
a3cf54c
HDDS-2250. Generated configs missing from ozone-filesystem-lib jars
adoroszlai Oct 4, 2019
f209722
HDDS-2257. Fix checkstyle issues in ChecksumByteBuffer (#1603)
vivekratnavel Oct 4, 2019
fb1ecff
Revert "YARN-9873. Mutation API Config Change updates Version Number.…
sunilgovind Oct 5, 2019
579dc87
HDDS-2251. Add an option to customize unit.sh and integration.sh para…
elek Oct 5, 2019
b8086bf
HADOOP-16626. S3A ITestRestrictedReadAccess fails without S3Guard.
steveloughran Oct 5, 2019
55c5436
Revert "HADOOP-16579. Upgrade to Apache Curator 4.2.0 excluding ZK (#…
jojochuang Oct 5, 2019
022fe5f
HDDS-2169. Avoid buffer copies while submitting client requests in Ra…
bshashikant Oct 7, 2019
14cd969
HADOOP-16512. [hadoop-tools] Fix order of actual and expected express…
pingsutw Sep 3, 2019
7f332eb
HDDS-2252. Enable gdpr robot test in daily build
dineshchitlangia Oct 7, 2019
1a77a15
HADOOP-16587. Make ABFS AAD endpoints configurable.
bilaharith Oct 7, 2019
9685a6c
HDDS-2239. Fix TestOzoneFsHAUrls (#1600)
adoroszlai Oct 7, 2019
382967b
HDFS-14373. EC : Decoding is failing when block group last incomplete…
surendralilhore Oct 7, 2019
1850652
track ADLS end-to-end latency
jemangwa Sep 30, 2019
2b49e3b
Merge branch 'HADOOP-16612' of https://github.com/jeeteshm/hadoop int…
jemangwa Oct 7, 2019
31 changes: 0 additions & 31 deletions BUILDING.txt
@@ -6,7 +6,6 @@ Requirements:
* Unix System
* JDK 1.8
* Maven 3.3 or later
* ProtocolBuffer 3.7.1
* CMake 3.1 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* Cyrus SASL devel (if compiling native code)
@@ -62,16 +61,6 @@ Installing required packages for clean install of Ubuntu 14.04 LTS Desktop:
$ sudo apt-get -y install maven
* Native libraries
$ sudo apt-get -y install build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev libsasl2-dev
* ProtocolBuffer 3.7.1 (required)
$ mkdir -p /opt/protobuf-3.7-src \
&& curl -L -s -S \
https://github.com/protocolbuffers/protobuf/releases/download/v3.7.1/protobuf-java-3.7.1.tar.gz \
-o /opt/protobuf-3.7.1.tar.gz \
&& tar xzf /opt/protobuf-3.7.1.tar.gz --strip-components 1 -C /opt/protobuf-3.7-src \
&& cd /opt/protobuf-3.7-src \
&& ./configure\
&& make install \
&& rm -rf /opt/protobuf-3.7-src

Optional packages:

@@ -310,16 +299,6 @@ level once; and then work from the submodule. Keep in mind that SNAPSHOTs
time out after a while, using the Maven '-nsu' will stop Maven from trying
to update SNAPSHOTs from external repos.

----------------------------------------------------------------------------------
Protocol Buffer compiler

The version of Protocol Buffer compiler, protoc, must match the version of the
protobuf JAR.

If you have multiple versions of protoc in your system, you can set in your
build shell the HADOOP_PROTOC_PATH environment variable to point to the one you
want to use for the Hadoop build. If you don't define this environment variable,
protoc is looked up in the PATH.
----------------------------------------------------------------------------------
Importing projects to eclipse

@@ -405,15 +384,6 @@ Installing required dependencies for clean install of macOS 10.14:
* Install native libraries, only openssl is required to compile native code,
you may optionally install zlib, lz4, etc.
$ brew install openssl
* Protocol Buffers 3.7.1 (required)
$ wget https://github.com/protocolbuffers/protobuf/releases/download/v3.7.1/protobuf-java-3.7.1.tar.gz
$ mkdir -p protobuf-3.7 && tar zxvf protobuf-java-3.7.1.tar.gz --strip-components 1 -C protobuf-3.7
$ cd protobuf-3.7
$ ./configure
$ make
$ make check
$ make install
$ protoc --version

Note that building Hadoop 3.1.1/3.1.2/3.2.0 native code from source is broken
on macOS. For 3.1.1/3.1.2, you need to manually backport YARN-8622. For 3.2.0,
@@ -439,7 +409,6 @@ Requirements:
* Windows System
* JDK 1.8
* Maven 3.0 or later
* ProtocolBuffer 3.7.1
* CMake 3.1 or newer
* Visual Studio 2010 Professional or Higher
* Windows SDK 8.1 (if building CPU rate control for the container executor)
19 changes: 1 addition & 18 deletions dev-support/docker/Dockerfile
@@ -105,23 +105,6 @@ RUN mkdir -p /opt/cmake \
ENV CMAKE_HOME /opt/cmake
ENV PATH "${PATH}:/opt/cmake/bin"

######
# Install Google Protobuf 3.7.1 (2.6.0 ships with Xenial)
######
# hadolint ignore=DL3003
RUN mkdir -p /opt/protobuf-src \
&& curl -L -s -S \
https://github.com/protocolbuffers/protobuf/releases/download/v3.7.1/protobuf-java-3.7.1.tar.gz \
-o /opt/protobuf.tar.gz \
&& tar xzf /opt/protobuf.tar.gz --strip-components 1 -C /opt/protobuf-src \
&& cd /opt/protobuf-src \
&& ./configure --prefix=/opt/protobuf \
&& make install \
&& cd /root \
&& rm -rf /opt/protobuf-src
ENV PROTOBUF_HOME /opt/protobuf
ENV PATH "${PATH}:/opt/protobuf/bin"

######
# Install Apache Maven 3.3.9 (3.3.9 ships with Xenial)
######
@@ -207,7 +190,7 @@ ENV MAVEN_OPTS -Xms256m -Xmx1536m
###

# Hugo static website generator (for new hadoop site and Ozone docs)
RUN curl -L -o hugo.deb https://github.com/gohugoio/hugo/releases/download/v0.30.2/hugo_0.30.2_Linux-64bit.deb \
RUN curl -L -o hugo.deb https://github.com/gohugoio/hugo/releases/download/v0.58.3/hugo_0.58.3_Linux-64bit.deb \
&& dpkg --install hugo.deb \
&& rm hugo.deb

@@ -41,6 +41,7 @@
import org.apache.hadoop.hdds.protocol.datanode.proto.ContainerProtos.ContainerCommandRequestProto;
import org.apache.hadoop.hdds.protocol.datanode.proto.ContainerProtos.ContainerCommandResponseProto;
import org.apache.hadoop.hdds.protocol.proto.HddsProtos;
import org.apache.hadoop.hdds.ratis.ContainerCommandRequestMessage;
import org.apache.hadoop.hdds.scm.client.HddsClientUtils;
import org.apache.hadoop.hdds.scm.pipeline.Pipeline;
import org.apache.hadoop.hdds.security.x509.SecurityConfig;
@@ -56,7 +57,6 @@
import org.apache.ratis.retry.RetryPolicy;
import org.apache.ratis.rpc.RpcType;
import org.apache.ratis.rpc.SupportedRpcType;
import org.apache.ratis.thirdparty.com.google.protobuf.ByteString;
import org.apache.ratis.thirdparty.com.google.protobuf.InvalidProtocolBufferException;
import org.apache.ratis.util.TimeDuration;
import org.slf4j.Logger;
@@ -219,39 +219,16 @@ private CompletableFuture<RaftClientReply> sendRequestAsync(
try (Scope scope = GlobalTracer.get()
.buildSpan("XceiverClientRatis." + request.getCmdType().name())
.startActive(true)) {
ContainerCommandRequestProto finalPayload =
ContainerCommandRequestProto.newBuilder(request)
.setTraceID(TracingUtil.exportCurrentSpan())
.build();
boolean isReadOnlyRequest = HddsUtils.isReadOnly(finalPayload);
ByteString byteString = finalPayload.toByteString();
if (LOG.isDebugEnabled()) {
LOG.debug("sendCommandAsync {} {}", isReadOnlyRequest,
sanitizeForDebug(finalPayload));
final ContainerCommandRequestMessage message
= ContainerCommandRequestMessage.toMessage(
request, TracingUtil.exportCurrentSpan());
if (HddsUtils.isReadOnly(request)) {
LOG.debug("sendCommandAsync ReadOnly {}", message);
return getClient().sendReadOnlyAsync(message);
} else {
LOG.debug("sendCommandAsync {}", message);
return getClient().sendAsync(message);
}
return isReadOnlyRequest ?
getClient().sendReadOnlyAsync(() -> byteString) :
getClient().sendAsync(() -> byteString);
}
}

private ContainerCommandRequestProto sanitizeForDebug(
ContainerCommandRequestProto request) {
switch (request.getCmdType()) {
case PutSmallFile:
return request.toBuilder()
.setPutSmallFile(request.getPutSmallFile().toBuilder()
.clearData()
)
.build();
case WriteChunk:
return request.toBuilder()
.setWriteChunk(request.getWriteChunk().toBuilder()
.clearData()
)
.build();
default:
return request;
}
}

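The hunk above replaces the hand-built `ByteString` payload with `ContainerCommandRequestMessage` and routes read-only commands onto the cheaper Ratis read path. A minimal sketch of that routing decision is below; `Request` is a hypothetical stand-in for `ContainerCommandRequestProto`, and only the dispatch logic mirrors the patch, not the Ratis API.

```java
// Sketch of the read-only dispatch introduced in sendRequestAsync above.
// Request is a hypothetical stand-in for ContainerCommandRequestProto;
// the strings name the two Ratis client send paths picked by the patch.
public class DispatchSketch {
    /** Minimal stand-in for a container command request. */
    static final class Request {
        final String cmdType;
        final boolean readOnly; // in the patch: HddsUtils.isReadOnly(request)
        Request(String cmdType, boolean readOnly) {
            this.cmdType = cmdType;
            this.readOnly = readOnly;
        }
    }

    /** Pick the send path, as the new code does with sendReadOnlyAsync/sendAsync. */
    static String route(Request r) {
        return r.readOnly ? "sendReadOnlyAsync" : "sendAsync";
    }

    public static void main(String[] args) {
        System.out.println(route(new Request("ReadChunk", true)));   // sendReadOnlyAsync
        System.out.println(route(new Request("WriteChunk", false))); // sendAsync
    }
}
```

Routing reads through `sendReadOnlyAsync` lets Ratis serve them without going through the write pipeline, which is the point of distinguishing the two paths in the patch.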
5 changes: 5 additions & 0 deletions hadoop-hdds/common/dev-support/findbugsExcludeFile.xml
@@ -25,4 +25,9 @@
<Class name="org.apache.hadoop.hdds.cli.GenericCli"></Class>
<Bug pattern="DM_EXIT" />
</Match>
<Match>
<Class name="org.apache.hadoop.ozone.common.ChecksumByteBuffer$CrcIntTable" />
<Method name="update" />
<Bug pattern="SF_SWITCH_FALLTHROUGH,SF_SWITCH_NO_DEFAULT" />
</Match>
</FindBugsFilter>
@@ -16,22 +16,29 @@
*/
package org.apache.hadoop.hdds.protocolPB;

import com.google.protobuf.RpcController;
import com.google.protobuf.ServiceException;
import java.io.Closeable;
import java.io.IOException;
import java.util.function.Consumer;

import org.apache.hadoop.hdds.protocol.SCMSecurityProtocol;
import org.apache.hadoop.hdds.protocol.proto.HddsProtos.DatanodeDetailsProto;
import org.apache.hadoop.hdds.protocol.proto.HddsProtos.OzoneManagerDetailsProto;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMGetCACertificateRequestProto;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMGetCertResponseProto;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMGetCertificateRequestProto;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMGetCertificateRequestProto.Builder;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMGetDataNodeCertRequestProto;
import org.apache.hadoop.hdds.protocol.SCMSecurityProtocol;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMSecurityRequest;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMSecurityRequest.Builder;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMSecurityResponse;
import org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.Type;
import org.apache.hadoop.hdds.tracing.TracingUtil;
import org.apache.hadoop.ipc.ProtobufHelper;
import org.apache.hadoop.ipc.ProtocolTranslator;
import org.apache.hadoop.ipc.RPC;

import com.google.protobuf.RpcController;
import com.google.protobuf.ServiceException;
import static org.apache.hadoop.hdds.protocol.proto.SCMSecurityProtocolProtos.SCMGetOMCertRequestProto;

/**
@@ -52,6 +59,28 @@ public SCMSecurityProtocolClientSideTranslatorPB(
this.rpcProxy = rpcProxy;
}

/**
* Helper method to wrap the request and send the message.
*/
private SCMSecurityResponse submitRequest(
SCMSecurityProtocolProtos.Type type,
Consumer<Builder> builderConsumer) throws IOException {
final SCMSecurityResponse response;
try {

Builder builder = SCMSecurityRequest.newBuilder()
.setCmdType(type)
.setTraceID(TracingUtil.exportCurrentSpan());
builderConsumer.accept(builder);
SCMSecurityRequest wrapper = builder.build();

response = rpcProxy.submitRequest(NULL_RPC_CONTROLLER, wrapper);
} catch (ServiceException ex) {
throw ProtobufHelper.getRemoteException(ex);
}
return response;
}

/**
* Closes this stream and releases any system resources associated
* with it. If the stream is already closed then invoking this
@@ -87,8 +116,8 @@ public String getDataNodeCertificate(DatanodeDetailsProto dataNodeDetails,
/**
* Get SCM signed certificate for OM.
*
* @param omDetails - OzoneManager Details.
* @param certSignReq - Certificate signing request.
* @param omDetails - OzoneManager Details.
* @param certSignReq - Certificate signing request.
* @return byte[] - SCM signed certificate.
*/
@Override
@@ -100,64 +129,61 @@ public String getOMCertificate(OzoneManagerDetailsProto omDetails,
/**
* Get SCM signed certificate for OM.
*
* @param omDetails - OzoneManager Details.
* @param certSignReq - Certificate signing request.
* @param omDetails - OzoneManager Details.
* @param certSignReq - Certificate signing request.
* @return byte[] - SCM signed certificate.
*/
public SCMGetCertResponseProto getOMCertChain(
OzoneManagerDetailsProto omDetails, String certSignReq)
throws IOException {
SCMGetOMCertRequestProto.Builder builder = SCMGetOMCertRequestProto
SCMGetOMCertRequestProto request = SCMGetOMCertRequestProto
.newBuilder()
.setCSR(certSignReq)
.setOmDetails(omDetails);
try {
return rpcProxy.getOMCertificate(NULL_RPC_CONTROLLER, builder.build());
} catch (ServiceException e) {
throw ProtobufHelper.getRemoteException(e);
}
.setOmDetails(omDetails)
.build();
return submitRequest(Type.GetOMCertificate,
builder -> builder.setGetOMCertRequest(request))
.getGetCertResponseProto();
}

/**
* Get SCM signed certificate with given serial id. Throws exception if
* certificate is not found.
*
* @param certSerialId - Certificate serial id.
* @param certSerialId - Certificate serial id.
* @return string - pem encoded certificate.
*/
@Override
public String getCertificate(String certSerialId) throws IOException {
Builder builder = SCMGetCertificateRequestProto
SCMGetCertificateRequestProto request = SCMGetCertificateRequestProto
.newBuilder()
.setCertSerialId(certSerialId);
try {
return rpcProxy.getCertificate(NULL_RPC_CONTROLLER, builder.build())
.getX509Certificate();
} catch (ServiceException e) {
throw ProtobufHelper.getRemoteException(e);
}
.setCertSerialId(certSerialId)
.build();
return submitRequest(Type.GetCertificate,
builder -> builder.setGetCertificateRequest(request))
.getGetCertResponseProto()
.getX509Certificate();
}

/**
* Get SCM signed certificate for Datanode.
*
* @param dnDetails - Datanode Details.
* @param certSignReq - Certificate signing request.
* @param dnDetails - Datanode Details.
* @param certSignReq - Certificate signing request.
* @return byte[] - SCM signed certificate.
*/
public SCMGetCertResponseProto getDataNodeCertificateChain(
DatanodeDetailsProto dnDetails, String certSignReq)
throws IOException {
SCMGetDataNodeCertRequestProto.Builder builder =

SCMGetDataNodeCertRequestProto request =
SCMGetDataNodeCertRequestProto.newBuilder()
.setCSR(certSignReq)
.setDatanodeDetails(dnDetails);
try {
return rpcProxy.getDataNodeCertificate(NULL_RPC_CONTROLLER,
builder.build());
} catch (ServiceException e) {
throw ProtobufHelper.getRemoteException(e);
}
.setDatanodeDetails(dnDetails)
.build();
return submitRequest(Type.GetDataNodeCertificate,
builder -> builder.setGetDataNodeCertRequest(request))
.getGetCertResponseProto();
}

/**
@@ -169,12 +195,10 @@ public SCMGetCertResponseProto getDataNodeCertificateChain(
public String getCACertificate() throws IOException {
SCMGetCACertificateRequestProto protoIns = SCMGetCACertificateRequestProto
.getDefaultInstance();
try {
return rpcProxy.getCACertificate(NULL_RPC_CONTROLLER, protoIns)
.getX509Certificate();
} catch (ServiceException e) {
throw ProtobufHelper.getRemoteException(e);
}
return submitRequest(Type.GetCACertificate,
builder -> builder.setGetCACertificateRequest(protoIns))
.getGetCertResponseProto().getX509Certificate();

}

/**
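The translator diff above converges every RPC onto one `submitRequest(type, builderConsumer)` helper that sets the command type and trace ID once, while each call site only fills in its request payload. A minimal sketch of that wrapper pattern is below, with a hand-rolled builder standing in for the generated `SCMSecurityRequest` protobuf builder; the names are illustrative, not the real protobuf API.

```java
import java.util.function.Consumer;

// Sketch of the request-wrapper pattern from submitRequest above.
// Builder is a hand-rolled stand-in for the generated SCMSecurityRequest
// builder; build() returns a string so the wiring is easy to inspect.
public class WrapperSketch {
    enum Type { GetCertificate, GetCACertificate }

    static final class Builder {
        Type cmdType;
        String traceId;
        String payload;
        Builder setCmdType(Type t) { this.cmdType = t; return this; }
        Builder setTraceId(String id) { this.traceId = id; return this; }
        Builder setPayload(String p) { this.payload = p; return this; }
        String build() { return cmdType + "|" + traceId + "|" + payload; }
    }

    /** One wrapper sets the common fields; callers only fill their payload. */
    static String submitRequest(Type type, Consumer<Builder> fill) {
        Builder b = new Builder()
            .setCmdType(type)
            .setTraceId("trace-1"); // in the patch: TracingUtil.exportCurrentSpan()
        fill.accept(b);
        return b.build(); // in the patch: rpcProxy.submitRequest(NULL_RPC_CONTROLLER, wrapper)
    }

    /** Example call site, mirroring getCertificate in the diff. */
    static String getCertificate(String serialId) {
        return submitRequest(Type.GetCertificate, b -> b.setPayload("serial=" + serialId));
    }
}
```

Centralizing the trace ID and error translation in one helper is what lets the diff delete the per-method `try { ... } catch (ServiceException e)` blocks shown above.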