InferenceInput
class InferenceInput
kotlin.Any
   ↳ android.adservices.ondevicepersonalization.InferenceInput
Contains all the information needed for a run of model inference. This is the input to android.adservices.ondevicepersonalization.ModelManager#run.
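As a rough, hedged sketch of how this class is typically used, the snippet below builds an InferenceInput with its Builder and hands it to ModelManager#run. The Builder constructor arguments, the setBatchSize setter, the run(input, executor, receiver) parameter order, and the params, expectedOutputStructure, modelManager, executor, and outcomeReceiver values are assumptions for illustration, not verified signatures.
```java
// Hedged sketch only: names and signatures marked "assumed" are not verified against the SDK.
float[][] input0 = new float[1][4];                  // one example with 4 float features (illustrative)
Object[] inputData = {input0};

InferenceInput input =
        new InferenceInput.Builder(params, inputData, expectedOutputStructure) // assumed constructor
                .setBatchSize(1)                     // assumed setter; see getBatchSize()
                .build();

// Assumed shape of ModelManager#run: the result is delivered asynchronously as an InferenceOutput.
modelManager.run(input, executor, outcomeReceiver);
```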
Summary
Nested classes | |
---|---|
InferenceInput.Builder | A builder for InferenceInput. |
Public methods | |
---|---|
Boolean | equals(other: Any?) Indicates whether some other object is "equal to" this one. |
Int | getBatchSize() The number of input examples. |
InferenceOutput | getExpectedOutputStructure() The empty InferenceOutput representing the expected output structure. |
Array<Any!> | getInputData() Note: use android.adservices.ondevicepersonalization.InferenceInput#getData() instead. |
InferenceInput.Params | getParams() The configuration that controls runtime interpreter behavior. |
Int | hashCode() |
Public methods
equals
fun equals(other: Any?): Boolean
Indicates whether some other object is "equal to" this one.
The equals method implements an equivalence relation on non-null object references:
- It is reflexive: for any non-null reference value x, x.equals(x) should return true.
- It is symmetric: for any non-null reference values x and y, x.equals(y) should return true if and only if y.equals(x) returns true.
- It is transitive: for any non-null reference values x, y, and z, if x.equals(y) returns true and y.equals(z) returns true, then x.equals(z) should return true.
- It is consistent: for any non-null reference values x and y, multiple invocations of x.equals(y) consistently return true or consistently return false, provided no information used in equals comparisons on the objects is modified.
- For any non-null reference value x, x.equals(null) should return false.
An equivalence relation partitions the elements it operates on into equivalence classes; all the members of an equivalence class are equal to each other. Members of an equivalence class are substitutable for each other, at least for some purposes.
Parameters | |
---|---|
obj | the reference object with which to compare. |
o | This value may be null. |

Return | |
---|---|
Boolean | true if this object is the same as the obj argument; false otherwise. |
getBatchSize
fun getBatchSize(): Int
The number of input examples. Adopters can set this field to run batched inference. The batch size is 1 by default and should match the size of the input data.
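As an illustrative, hedged sketch of the shape relationship (the Builder constructor and setBatchSize below are assumptions based on this page, and params / expectedOutputStructure are assumed to be built elsewhere):
```java
// Hedged sketch: a batch of 3 examples, each with 2 float features.
// The leading dimension of every input tensor matches the batch size.
float[][] input0 = new float[3][2];   // input tensor shape is [3, 2]
Object[] inputData = {input0};

InferenceInput batched =
        new InferenceInput.Builder(params, inputData, expectedOutputStructure) // assumed constructor
                .setBatchSize(3)      // assumed setter; must match the leading input dimension
                .build();
```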
getExpectedOutputStructure
fun getExpectedOutputStructure(): InferenceOutput
The empty InferenceOutput representing the expected output structure. For LiteRT, the inference code will verify whether this expected output structure matches the model's output signature.
If a model produces string tensors:
```java
String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();
```
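For comparison, a hedged sketch of the same pattern for a model that produces a single float output tensor (the shape is purely illustrative):
```java
// Illustrative only: the expected output is one float tensor of shape [3, 2].
float[][] output = new float[3][2];
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
InferenceOutput expectedOutputStructure =
        new InferenceOutput.Builder().setDataOutputs(outputs).build();
```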
Return | |
---|---|
InferenceOutput | This value cannot be null. |
getInputData
fun getInputData(): Array<Any!>
Note: use android.adservices.ondevicepersonalization.InferenceInput#getData() instead.
An array of input data. The inputs should be in the same order as the inputs of the model.
For example, if a model takes multiple inputs:
```java
String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
Object[] inputData = {input0, input1, ...};
```
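And a hedged sketch of the simplest case, a model with a single input; even then the tensor is wrapped in the array (the shape is illustrative):
```java
// Illustrative only: one float input tensor of shape [1, 4].
float[][] input0 = new float[1][4];
Object[] inputData = {input0};
```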
Return | |
---|---|
Array<Any!> | This value cannot be null. |
getParams
fun getParams(): InferenceInput.Params
The configuration that controls runtime interpreter behavior.
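A hedged sketch of building this configuration follows; the Params.Builder constructor, the setRecommendedNumThreads setter, and the remoteData and "my_model" values are assumptions for illustration, not verified API.
```java
// Hedged sketch only: the constructor and setter below are assumed, not verified against the SDK.
InferenceInput.Params params =
        new InferenceInput.Params.Builder(remoteData, "my_model") // assumed (KeyValueStore, modelKey) arguments
                .setRecommendedNumThreads(4)                      // assumed hint for interpreter threading
                .build();
```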
Return | |
---|---|
InferenceInput.Params | This value cannot be null. |