
Searched full:inference (Results 1 – 10 of 10) sorted by relevance

/linux/net/ipv4/
tcp_lp.c
48 * @LP_WITHIN_INF: are we within inference?
71 * @inference: current inference
87 u32 inference; member
110 lp->inference = 0; in tcp_lp_init()
118 * Will only call newReno CA when away from inference.
281 /* calc inference */ in tcp_lp_pkts_acked()
284 lp->inference = 3 * delta; in tcp_lp_pkts_acked()
286 /* test if within inference */ in tcp_lp_pkts_acked()
287 if (lp->last_drop && (now - lp->last_drop < lp->inference)) in tcp_lp_pkts_acked()
308 * and will usually be within threshold when within inference */ in tcp_lp_pkts_acked()
[all …]
/linux/Documentation/accel/
introduction.rst
19 - Edge AI - doing inference at an edge device. It can be an embedded ASIC/FPGA,
23 - Inference data-center - single/multi user devices in a large server. This
32 - Training data-center - Similar to Inference data-center cards, but typically
/linux/rust/pin-init/src/
__internal.rs
70 /// Type inference helper function.
80 /// inference help as `HasPinData`.
100 /// Type inference helper function.
macros.rs
233 //! // Ensure that `data` really is of type `PinData` and help with type inference:
1223 // Ensure that `data` really is of type `$data` and help with type inference:
1482 // get the correct type inference here:
1486 // We have to use type inference here to make zeroed have the correct type. This does
1510 // We abuse `slot` to get the correct type inference here:
/linux/drivers/accel/qaic/
Kconfig
14 designed to accelerate Deep Learning inference workloads.
/linux/include/uapi/drm/
ivpu_accel.h
169 * Device-unique inference ID (read-only)
310 * Performs Deep Learning Neural Compute Inference Operations
/linux/drivers/accel/ivpu/
ivpu_pm.c
38 MODULE_PARM_DESC(inference_timeout_ms, "Inference maximum duration, in milliseconds, 0 - default");
199 vdev->timeout.inference; in ivpu_job_timeout_work()
vpu_jsm_api.h
74 /* Job status returned when the job was preempted mid-inference */
/linux/tools/bpf/bpftool/
main.h
267 * past and which do not follow the string inference scheme that libbpf uses. in hashmap__empty()
/linux/arch/m68k/mac/
via.c
628 * State | vT1CH | VIA_TIMER_1_INT | inference drawn in mac_read_clk()