| Year | Rank | Type | Title / Venue / Authors |
|---|---|---|---|
| 2026 | J | jnl | ScoutAttention: Efficient KV Cache Offloading via Layer-Ahead CPU Pre-computation for LLM Inference. CoRR |
| 2025 | B | conf | TrustCom |
| 2025 | J | jnl | CoRR |
| 2025 | J | jnl | Eng. Appl. Artif. Intell. |
| 2025 | J | jnl | CoRR |
| 2025 | A* | conf | INFOCOM |
| 2024 | J | jnl | CoRR |
| 2023 | J | jnl | IEEE Trans. Mob. Comput. |
| 2023 | J | jnl | Sensors |
| 2022 | J | jnl | Medical Biol. Eng. Comput. |
| 2022 | J | jnl | Neurocomputing |
| 2022 | A* | conf | OSDI |
| 2020 | J | jnl | J. Sensors |
| 2018 | B | conf | WCNC |
| 2018 | J | jnl | Genom. Proteom. Bioinform. |
| 2017 | — | conf | 5GWN |
| 2016 | J | jnl | Secur. Commun. Networks |
| 2015 | B | conf | LCN |
| 2015 | B | conf | LCN |
| 2015 | — | conf | ICC |
| 2014 | J | jnl | J. Netw. Comput. Appl. |
| 2014 | B | conf | ICCCN |
| 2012 | B | conf | PRICAI |
| 2008 | — | conf | CSSE (2) |