Searched full:inference (Results 1 – 8 of 8) sorted by relevance
48 * @LP_WITHIN_INF: are we within inference?
71 * @inference: current inference
87 u32 inference; member
110 lp->inference = 0; in tcp_lp_init()
120 * Will only call newReno CA when away from inference.
284 /* calc inference */ in tcp_lp_pkts_acked()
287 lp->inference = 3 * delta; in tcp_lp_pkts_acked()
289 /* test if within inference */ in tcp_lp_pkts_acked()
290 if (lp->last_drop && (now - lp->last_drop < lp->inference)) in tcp_lp_pkts_acked()
[all...]
19 - Edge AI - doing inference at an edge device. It can be an embedded ASIC/FPGA,
23 - Inference data-center - single/multi user devices in a large server. This
32 - Training data-center - Similar to Inference data-center cards, but typically
70 /// Type inference helper function.
80 /// inference help as `HasPinData`.
100 /// Type inference helper function.
4 /// type inference to figure out a return type for those tokens.
268 /// Functions default to `()` and closures default to type inference.
628 * State | vT1CH | VIA_TIMER_1_INT | inference drawn in mac_read_clk()
1659 "asic doesn't allow inference soft reset - do hard-reset instead\n"); in hl_device_reset()
1730 dev_dbg(hdev->dev, "Going to reset engines of inference device\n"); in hl_device_reset()