Package org.opencv.dnn

Class Net

java.lang.Object
    org.opencv.dnn.Net

public class Net extends java.lang.Object

This class allows creating and manipulating comprehensive artificial neural networks. A neural network is represented as a directed acyclic graph (DAG), where vertices are Layer instances and edges specify relationships between layer inputs and outputs. Each network layer has a unique integer id and a unique string name inside its network. A LayerId can store either a layer name or a layer id. This class supports reference counting of its instances, i.e. copies point to the same instance.
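A typical workflow is to read a model, feed it an input blob, and run a forward pass. The following is a minimal sketch: the model and image file names are hypothetical, and Dnn.readNetFromCaffe is just one of several reader functions in the Dnn class.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.core.Size;
import org.opencv.dnn.Dnn;
import org.opencv.dnn.Net;
import org.opencv.imgcodecs.Imgcodecs;

public class NetBasics {
    public static void main(String[] args) {
        // The OpenCV native library must be loaded before any org.opencv call.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Hypothetical file names -- substitute your own model and image.
        Net net = Dnn.readNetFromCaffe("deploy.prototxt", "model.caffemodel");

        // Build a 4D NCHW blob from the image: scale pixels to [0,1],
        // resize to a (hypothetical) 224x224 network input.
        Mat image = Imgcodecs.imread("input.jpg");
        Mat blob = Dnn.blobFromImage(image, 1.0 / 255.0, new Size(224, 224),
                new Scalar(0, 0, 0), /*swapRB=*/true, /*crop=*/false);

        net.setInput(blob);
        Mat prob = net.forward();  // forward pass over the whole network
        System.out.println("Output blob size: " + prob.size());
    }
}
```

Every other method on this page operates on a Net obtained this way (or via readFromModelOptimizer below).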
Field Summary

protected long nativeObj
Method Summary

static Net __fromPtr__(long addr)
void connect(java.lang.String outPin, java.lang.String inpPin)
    Connects output of the first layer to input of the second layer.
java.lang.String dump()
    Dump net to String.
void dumpToFile(java.lang.String path)
    Dump net structure, hyperparameters, backend, target and fusion to a dot file.
boolean empty()
    Returns true if there are no layers in the network.
void enableFusion(boolean fusion)
    Enables or disables layer fusion in the network.
protected void finalize()
Mat forward()
    Runs forward pass for the whole network.
Mat forward(java.lang.String outputName)
    Runs forward pass to compute output of layer with name outputName.
void forward(java.util.List<Mat> outputBlobs)
    Runs forward pass for the whole network.
void forward(java.util.List<Mat> outputBlobs, java.lang.String outputName)
    Runs forward pass to compute output of layer with name outputName.
void forward(java.util.List<Mat> outputBlobs, java.util.List<java.lang.String> outBlobNames)
    Runs forward pass to compute outputs of layers listed in outBlobNames.
long getFLOPS(int layerId, java.util.List<MatOfInt> netInputShapes)
long getFLOPS(int layerId, MatOfInt netInputShape)
long getFLOPS(java.util.List<MatOfInt> netInputShapes)
    Computes FLOP for the whole loaded model with specified input shapes.
long getFLOPS(MatOfInt netInputShape)
Layer getLayer(DictValue layerId)
    Returns a pointer to the layer with the specified id or name which the network uses.
int getLayerId(java.lang.String layer)
    Converts string name of the layer to the integer identifier.
java.util.List<java.lang.String> getLayerNames()
int getLayersCount(java.lang.String layerType)
    Returns count of layers of specified type.
void getLayerTypes(java.util.List<java.lang.String> layersTypes)
    Returns list of types for layers used in the model.
void getMemoryConsumption(int layerId, java.util.List<MatOfInt> netInputShapes, long[] weights, long[] blobs)
void getMemoryConsumption(int layerId, MatOfInt netInputShape, long[] weights, long[] blobs)
void getMemoryConsumption(MatOfInt netInputShape, long[] weights, long[] blobs)
long getNativeObjAddr()
Mat getParam(DictValue layer)
    Returns parameter blob of the layer.
Mat getParam(DictValue layer, int numParam)
    Returns parameter blob of the layer.
long getPerfProfile(MatOfDouble timings)
    Returns overall time for inference and timings (in ticks) for layers.
MatOfInt getUnconnectedOutLayers()
    Returns indexes of layers with unconnected outputs.
java.util.List<java.lang.String> getUnconnectedOutLayersNames()
    Returns names of layers with unconnected outputs.
static Net readFromModelOptimizer(java.lang.String xml, java.lang.String bin)
    Create a network from Intel's Model Optimizer intermediate representation (IR).
static Net readFromModelOptimizer(MatOfByte bufferModelConfig, MatOfByte bufferWeights)
    Create a network from Intel's Model Optimizer in-memory buffers with intermediate representation (IR).
void setHalideScheduler(java.lang.String scheduler)
    Compile Halide layers.
void setInput(Mat blob)
    Sets the new input value for the network.
void setInput(Mat blob, java.lang.String name)
    Sets the new input value for the network.
void setInput(Mat blob, java.lang.String name, double scalefactor)
    Sets the new input value for the network.
void setInput(Mat blob, java.lang.String name, double scalefactor, Scalar mean)
    Sets the new input value for the network.
void setInputShape(java.lang.String inputName, MatOfInt shape)
    Specify shape of network input.
void setInputsNames(java.util.List<java.lang.String> inputBlobNames)
    Sets outputs names of the network input pseudo layer.
void setParam(DictValue layer, int numParam, Mat blob)
    Sets the new value for the learned param of the layer.
void setPreferableBackend(int backendId)
    Ask network to use specific computation backend where it is supported.
void setPreferableTarget(int targetId)
    Ask network to make computations on specific target device.
 
Method Detail
getNativeObjAddr

public long getNativeObjAddr()

__fromPtr__

public static Net __fromPtr__(long addr)
readFromModelOptimizer

public static Net readFromModelOptimizer(java.lang.String xml, java.lang.String bin)

Create a network from Intel's Model Optimizer intermediate representation (IR). Networks imported from Intel's Model Optimizer are launched in Intel's Inference Engine backend.

Parameters:
    xml - XML configuration file with network's topology.
    bin - Binary file with trained weights.
Returns:
    Net object.
 
readFromModelOptimizer

public static Net readFromModelOptimizer(MatOfByte bufferModelConfig, MatOfByte bufferWeights)

Create a network from Intel's Model Optimizer in-memory buffers with intermediate representation (IR).

Parameters:
    bufferModelConfig - buffer with model's configuration.
    bufferWeights - buffer with model's trained weights.
Returns:
    Net object.
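Loading an IR model can be sketched as follows; the .xml/.bin file names are hypothetical and must point at a pair produced by OpenVINO's Model Optimizer.

```java
import org.opencv.core.Core;
import org.opencv.dnn.Net;

public class ReadIrExample {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Hypothetical IR pair: topology in .xml, weights in .bin.
        Net net = Net.readFromModelOptimizer("face-detection.xml",
                                             "face-detection.bin");

        // Networks imported this way run on the Inference Engine backend.
        System.out.println("Loaded: " + !net.empty());
    }
}
```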
 
empty

public boolean empty()

Returns true if there are no layers in the network.

Returns:
    true if the network contains no layers, false otherwise.
 
dump

public java.lang.String dump()

Dump net to String. Call this method after setInput(). To see the correct backend, target and fusion, call it after forward().

Returns:
    String with structure, hyperparameters, backend, target and fusion.
 
dumpToFile

public void dumpToFile(java.lang.String path)

Dump net structure, hyperparameters, backend, target and fusion to a dot file.

Parameters:
    path - path to the output file with .dot extension.
SEE: dump()
 
getLayerId

public int getLayerId(java.lang.String layer)

Converts string name of the layer to the integer identifier.

Parameters:
    layer - string name of the layer.
Returns:
    id of the layer, or -1 if the layer wasn't found.
 
getLayerNames

public java.util.List<java.lang.String> getLayerNames()
getLayer

public Layer getLayer(DictValue layerId)

Returns a pointer to the layer with the specified id or name which the network uses.

Parameters:
    layerId - id or name of the layer.
Returns:
    the requested layer.
 
connect

public void connect(java.lang.String outPin, java.lang.String inpPin)

Connects output of the first layer to input of the second layer.

Parameters:
    outPin - descriptor of the first layer output.
    inpPin - descriptor of the second layer input.

Descriptors have the following template: <layer_name>[.input_number]
- the first part of the template, layer_name, is the string name of the added layer; if this part is empty, the network input pseudo layer will be used;
- the second, optional part of the template, input_number, is either the number of the layer input or its label; if this part is omitted, the first layer input will be used.

SEE: setNetInputs(), Layer::inputNameToIndex(), Layer::outputNameToIndex()
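The descriptor format can be illustrated with a short sketch; the layer names below ("conv1", "relu1", "concat") are hypothetical.

```java
import org.opencv.dnn.Net;

public class ConnectSketch {
    // Descriptors follow <layer_name>[.input_number]; names here are made up.
    static void wire(Net net) {
        net.connect("conv1", "relu1");    // first output of conv1 -> first input of relu1
        net.connect("conv1", "concat.1"); // first output of conv1 -> second input of concat
        net.connect("", "conv1");         // empty layer_name: the network input pseudo layer
    }
}
```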
 
setInputsNames

public void setInputsNames(java.util.List<java.lang.String> inputBlobNames)

Sets outputs names of the network input pseudo layer. Each net always has its own special network input pseudo layer with id=0. This layer stores the user blobs only and doesn't make any computations; in fact, it provides the only way to pass user data into the network. As any other layer, this layer can label its outputs, and this function provides an easy way to do so.

Parameters:
    inputBlobNames - names to assign to the outputs of the input pseudo layer.
 
setInputShape

public void setInputShape(java.lang.String inputName, MatOfInt shape)

Specify shape of network input.

Parameters:
    inputName - name of the network input.
    shape - shape of the input.
 
forward

public Mat forward(java.lang.String outputName)

Runs forward pass to compute output of layer with name outputName.

Parameters:
    outputName - name of the layer whose output is needed.
Returns:
    blob for the first output of the specified layer.
 
forward

public Mat forward()

Runs forward pass for the whole network (equivalent to calling forward with an empty outputName).

Returns:
    blob for the first output of the last layer.
 
forward

public void forward(java.util.List<Mat> outputBlobs, java.lang.String outputName)

Runs forward pass to compute output of layer with name outputName.

Parameters:
    outputBlobs - contains all output blobs for the specified layer.
    outputName - name of the layer whose output is needed. If outputName is empty, runs forward pass for the whole network.
 
forward

public void forward(java.util.List<Mat> outputBlobs)

Runs forward pass for the whole network (equivalent to calling forward with an empty outputName).

Parameters:
    outputBlobs - contains all output blobs for the last layer.
 
forward

public void forward(java.util.List<Mat> outputBlobs, java.util.List<java.lang.String> outBlobNames)

Runs forward pass to compute outputs of layers listed in outBlobNames.

Parameters:
    outputBlobs - contains blobs for the first outputs of the specified layers.
    outBlobNames - names of the layers whose outputs are needed.
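Fetching several outputs in one pass, as detection models with multiple heads require, can be sketched as follows. It assumes a loaded net and a prepared input blob, and takes the output names from the network itself via getUnconnectedOutLayersNames().

```java
import org.opencv.core.Mat;
import org.opencv.dnn.Net;

import java.util.ArrayList;
import java.util.List;

public class MultiOutputForward {
    // Runs one forward pass and collects every unconnected (i.e. final) output.
    static List<Mat> run(Net net, Mat blob) {
        net.setInput(blob);
        List<String> outNames = net.getUnconnectedOutLayersNames();
        List<Mat> outputs = new ArrayList<>();
        net.forward(outputs, outNames);  // one pass, all requested outputs
        return outputs;
    }
}
```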
 
setHalideScheduler

public void setHalideScheduler(java.lang.String scheduler)

Compile Halide layers. Schedules layers that support the Halide backend, then compiles them for a specific target. For layers not represented in the scheduling file, or if no manual scheduling is used at all, automatic scheduling will be applied.

Parameters:
    scheduler - path to YAML file with scheduling directives.
SEE: setPreferableBackend
 
setPreferableBackend

public void setPreferableBackend(int backendId)

Ask network to use specific computation backend where it is supported.

Parameters:
    backendId - backend identifier.
SEE: Backend

If OpenCV is compiled with Intel's Inference Engine library, DNN_BACKEND_DEFAULT means DNN_BACKEND_INFERENCE_ENGINE. Otherwise it equals DNN_BACKEND_OPENCV.
 
setPreferableTarget

public void setPreferableTarget(int targetId)

Ask network to make computations on specific target device.

Parameters:
    targetId - target identifier.
SEE: Target

List of supported backend / target combinations:

|                        | DNN_BACKEND_OPENCV | DNN_BACKEND_INFERENCE_ENGINE | DNN_BACKEND_HALIDE | DNN_BACKEND_CUDA |
|------------------------|--------------------|------------------------------|--------------------|------------------|
| DNN_TARGET_CPU         | +                  | +                            | +                  |                  |
| DNN_TARGET_OPENCL      | +                  | +                            | +                  |                  |
| DNN_TARGET_OPENCL_FP16 | +                  | +                            |                    |                  |
| DNN_TARGET_MYRIAD      |                    | +                            |                    |                  |
| DNN_TARGET_FPGA        |                    | +                            |                    |                  |
| DNN_TARGET_CUDA        |                    |                              |                    | +                |
| DNN_TARGET_CUDA_FP16   |                    |                              |                    | +                |
| DNN_TARGET_HDDL        |                    | +                            |                    |                  |
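Backend and target identifiers are the DNN_BACKEND_* and DNN_TARGET_* constants from the Dnn class. A sketch of two common combinations:

```java
import org.opencv.dnn.Dnn;
import org.opencv.dnn.Net;

public class BackendSelection {
    static void preferCpu(Net net) {
        // Plain OpenCV backend on the CPU: the most portable combination.
        net.setPreferableBackend(Dnn.DNN_BACKEND_OPENCV);
        net.setPreferableTarget(Dnn.DNN_TARGET_CPU);
    }

    static void preferCuda(Net net) {
        // Requires OpenCV built with CUDA support; otherwise the
        // preference is ignored or falls back at forward() time.
        net.setPreferableBackend(Dnn.DNN_BACKEND_CUDA);
        net.setPreferableTarget(Dnn.DNN_TARGET_CUDA);
    }
}
```

Setting a preference is cheap; the actual backend is bound lazily on the next forward pass.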
 
setInput

public void setInput(Mat blob, java.lang.String name, double scalefactor, Scalar mean)

Sets the new input value for the network.

Parameters:
    blob - a new blob; should have CV_32F or CV_8U depth.
    name - a name of input layer.
    scalefactor - an optional normalization scale.
    mean - an optional mean subtraction values.
SEE: connect(String, String) to know the format of the descriptor.

If scale or mean values are specified, the final input blob is computed as: \(input(n,c,h,w) = scalefactor \times (blob(n,c,h,w) - mean_c)\)
 
setInput

public void setInput(Mat blob, java.lang.String name, double scalefactor)

Sets the new input value for the network.

Parameters:
    blob - a new blob; should have CV_32F or CV_8U depth.
    name - a name of input layer.
    scalefactor - an optional normalization scale.
SEE: connect(String, String) to know the format of the descriptor.

If scale or mean values are specified, the final input blob is computed as: \(input(n,c,h,w) = scalefactor \times (blob(n,c,h,w) - mean_c)\)
 
setInput

public void setInput(Mat blob, java.lang.String name)

Sets the new input value for the network.

Parameters:
    blob - a new blob; should have CV_32F or CV_8U depth.
    name - a name of input layer.
SEE: connect(String, String) to know the format of the descriptor.

If scale or mean values are specified, the final input blob is computed as: \(input(n,c,h,w) = scalefactor \times (blob(n,c,h,w) - mean_c)\)
 
setInput

public void setInput(Mat blob)

Sets the new input value for the network.

Parameters:
    blob - a new blob; should have CV_32F or CV_8U depth.
SEE: connect(String, String) to know the format of the descriptor.

If scale or mean values are specified, the final input blob is computed as: \(input(n,c,h,w) = scalefactor \times (blob(n,c,h,w) - mean_c)\)
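The scalefactor/mean overload applies the normalization formula on the fly, so an 8-bit blob can be fed without pre-scaling. A sketch; the input layer name "data" is hypothetical.

```java
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.dnn.Net;

public class NormalizedInput {
    // input(n,c,h,w) = scalefactor * (blob(n,c,h,w) - mean_c)
    // With these values, 8-bit pixels [0, 255] map to roughly [-1, 1].
    static void feed(Net net, Mat blob) {
        // "data" is a hypothetical input layer name.
        net.setInput(blob, "data", 1.0 / 127.5, new Scalar(127.5, 127.5, 127.5));
    }
}
```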
 
setParam

public void setParam(DictValue layer, int numParam, Mat blob)

Sets the new value for the learned param of the layer.

Parameters:
    layer - name or id of the layer.
    numParam - index of the layer parameter in the Layer::blobs array.
    blob - the new value.
SEE: Layer::blobs
Note: If shape of the new blob differs from the previous shape, then the following forward pass may fail.
 
getParam

public Mat getParam(DictValue layer, int numParam)

Returns parameter blob of the layer.

Parameters:
    layer - name or id of the layer.
    numParam - index of the layer parameter in the Layer::blobs array.
SEE: Layer::blobs
Returns:
    parameter blob of the layer.
 
getParam

public Mat getParam(DictValue layer)

Returns parameter blob of the layer.

Parameters:
    layer - name or id of the layer.
SEE: Layer::blobs
Returns:
    parameter blob of the layer.
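Reading and writing a learned blob can be sketched as follows; the layer name "conv1" is hypothetical, and the blob indices assume the common weights-then-bias layout of Layer::blobs.

```java
import org.opencv.core.Mat;
import org.opencv.dnn.DictValue;
import org.opencv.dnn.Net;

public class ParamAccess {
    // Reads the weight blob (index 0 in Layer::blobs) of a layer,
    // then writes it back unchanged via setParam.
    static Mat roundTrip(Net net) {
        DictValue layer = new DictValue("conv1");  // hypothetical layer name
        Mat weights = net.getParam(layer, 0);      // index 0: weights; index 1 is often bias
        net.setParam(layer, 0, weights);           // same shape, so the next forward pass is safe
        return weights;
    }
}
```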
 
getUnconnectedOutLayers

public MatOfInt getUnconnectedOutLayers()

Returns indexes of layers with unconnected outputs.

Returns:
    indexes of layers with unconnected outputs.
 
getUnconnectedOutLayersNames

public java.util.List<java.lang.String> getUnconnectedOutLayersNames()

Returns names of layers with unconnected outputs.

Returns:
    names of layers with unconnected outputs.
 
getFLOPS

public long getFLOPS(java.util.List<MatOfInt> netInputShapes)

Computes FLOP for the whole loaded model with specified input shapes.

Parameters:
    netInputShapes - vector of shapes for all net inputs.
Returns:
    computed FLOP.
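FLOP counting needs the intended input shape(s), since the cost of most layers depends on the spatial size. A sketch assuming a single 4D NCHW input; the 1x3x224x224 shape is a common but hypothetical choice.

```java
import org.opencv.core.MatOfInt;
import org.opencv.dnn.Net;

public class FlopsEstimate {
    // Estimates the model's FLOP count for a 1x3x224x224 input.
    static long flopsFor224(Net net) {
        MatOfInt shape = new MatOfInt(1, 3, 224, 224);  // NCHW
        return net.getFLOPS(shape);
    }
}
```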
 
getFLOPS

public long getFLOPS(MatOfInt netInputShape)

getFLOPS

public long getFLOPS(int layerId, java.util.List<MatOfInt> netInputShapes)

getFLOPS

public long getFLOPS(int layerId, MatOfInt netInputShape)
getLayerTypes

public void getLayerTypes(java.util.List<java.lang.String> layersTypes)

Returns list of types for layers used in the model.

Parameters:
    layersTypes - output parameter for returning types.
 
getLayersCount

public int getLayersCount(java.lang.String layerType)

Returns count of layers of specified type.

Parameters:
    layerType - type.
Returns:
    count of layers.
 
getMemoryConsumption

public void getMemoryConsumption(MatOfInt netInputShape, long[] weights, long[] blobs)

getMemoryConsumption

public void getMemoryConsumption(int layerId, java.util.List<MatOfInt> netInputShapes, long[] weights, long[] blobs)

getMemoryConsumption

public void getMemoryConsumption(int layerId, MatOfInt netInputShape, long[] weights, long[] blobs)
enableFusion

public void enableFusion(boolean fusion)

Enables or disables layer fusion in the network. Fusion is enabled by default.

Parameters:
    fusion - true to enable the fusion, false to disable.
 
getPerfProfile

public long getPerfProfile(MatOfDouble timings)

Returns overall time for inference and timings (in ticks) for layers. Indexes in the returned vector correspond to layer ids. Some layers can be fused with others; in that case a zero tick count is returned for those skipped layers.

Parameters:
    timings - vector of tick timings for all layers.
Returns:
    overall ticks for model inference.
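Ticks convert to wall-clock time through Core.getTickFrequency(). A sketch that reports the overall inference time in milliseconds; it assumes a forward pass has already been run on the net.

```java
import org.opencv.core.Core;
import org.opencv.core.MatOfDouble;
import org.opencv.dnn.Net;

public class PerfReport {
    // Call after net.forward(); converts overall inference ticks to milliseconds.
    static double inferenceMs(Net net) {
        MatOfDouble layerTimings = new MatOfDouble();
        long totalTicks = net.getPerfProfile(layerTimings);
        // Fused layers report zero ticks inside layerTimings.
        return totalTicks * 1000.0 / Core.getTickFrequency();
    }
}
```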
 
finalize

protected void finalize() throws java.lang.Throwable

Overrides:
    finalize in class java.lang.Object
Throws:
    java.lang.Throwable
 