Chinese Academy of Sciences Institutional Repositories Grid (CAS IR Grid)
Institutions
Institute of Automation [7]
Institute of Computing Technology [3]
Institute of Geographic Sciences and Natural Resources Research [2]
Chengdu Institute of Mountain Hazards and Environ… [1]
Institute of Oceanology [1]
Chongqing Institute of Green and Intelligent Tech… [1]
More
Collection Method
OAI harvesting [15]
Content Type
Journal article [11]
Thesis [2]
CNKI journal article [1]
SCI/SSCI paper [1]
Publication Date
2024 [1]
2023 [1]
2022 [2]
2021 [3]
2020 [3]
2019 [1]
More
Subject
Computer Science and Technology [1]
Computer Science and Technology::人… [1]
Computer Science and Technology::计… [1]
Browse/Search Results: 15 records in total, showing 1–10
Influences of shallow groundwater depth on N2O diffusion along the soil profile of summer maize fields in North China Plain
Journal article | OAI harvesting
SCIENCE OF THE TOTAL ENVIRONMENT, 2024, Volume: 926, Pages: 171861
Authors: Li, Zhao; Li, Xurun; Zhang, Qiuying; Li, Fadong; Qiao, Yunfeng
Views/Downloads: 17/0 | Submitted: 2024/06/17
Keywords: Shallow groundwater depth (SGD); Soil profile; Nitrous oxide (N2O); Lysimeter; Summer maize field; North China Plain (NCP)
Distributed Training and Optimization Algorithms for Heterogeneous Clusters (面向异构集群的分布式训练与优化算法)
Thesis | OAI harvesting
2023
Authors: 晁永越
Views/Downloads: 23/0 | Submitted: 2023/06/19
Keywords: Distributed deep learning; Heterogeneous cluster; Task allocation; Asynchronous SGD algorithm
Fast and accurate variable batch size convolution neural network training on large scale distributed systems
Journal article | OAI harvesting
CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2022, Pages: 26
Authors: Hu, Zhongzhe; Xiao, Junmin; Sun, Ninghui; Tan, Guangming
Views/Downloads: 30/0 | Submitted: 2022/12/07
Keywords: deep learning; distributed computing; ImageNet-1K; large-batch training; synchronous SGD
Towards Better Generalization of Deep Neural Networks via Non-Typicality Sampling Scheme
Journal article | OAI harvesting
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, Pages: 11
Authors: Peng, Xinyu; Wang, Fei-Yue; Li, Li
Views/Downloads: 29/0 | Submitted: 2022/06/06
Keywords: Training; Estimation; Deep learning; Standards; Optimization; Noise measurement; Convergence; generalization performance; nontypicality sampling scheme; stochastic gradient descent (SGD)
Drill the Cork of Information Bottleneck by Inputting the Most Important Data
Journal article | OAI harvesting
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, Pages: 13
Authors: Peng, Xinyu; Zhang, Jiawei; Wang, Fei-Yue; Li, Li
Views/Downloads: 14/0 | Submitted: 2022/01/27
Keywords: Training; Signal to noise ratio; Mutual information; Optimization; Convergence; Deep learning; Tools; Information bottleneck (IB) theory; machine learning; minibatch stochastic gradient descent (SGD); typicality sampling
Efficient and High-quality Recommendations via Momentum-incorporated Parallel Stochastic Gradient Descent-Based Learning
Journal article | OAI harvesting
IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2021, Volume: 8, Issue: 2, Pages: 402-411
Authors: Luo, Xin; Qin, Wen; Dong, Ani; Sedraoui, Khaled; Zhou, MengChu
Views/Downloads: 24/0 | Submitted: 2021/03/17
Keywords: Big data; industrial application; industrial data; latent factor analysis; machine learning; parallel algorithm; recommender system (RS); stochastic gradient descent (SGD)
Efficient and High-quality Recommendations via Momentum-incorporated Parallel Stochastic Gradient Descent-Based Learning
Journal article | OAI harvesting
IEEE/CAA Journal of Automatica Sinica, 2021, Volume: 8, Issue: 2, Pages: 402-411
Authors: Xin Luo; Wen Qin; Ani Dong; Khaled Sedraoui; MengChu Zhou
Views/Downloads: 16/0 | Submitted: 2021/04/09
Keywords: Big data; industrial application; industrial data; latent factor analysis; machine learning; parallel algorithm; recommender system (RS); stochastic gradient descent (SGD)
WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system
Journal article | OAI harvesting
JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2020, Volume: 145, Pages: 202-216
Authors: Cheng Daning; Li Shigang; Zhang Yunquan
Views/Downloads: 37/0 | Submitted: 2020/12/10
Keywords: SGD; Unbalanced workload; SimuParallel SGD; Distributed system
Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling
Journal article | OAI harvesting
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, Volume: 31, Issue: 11, Pages: 4649-4659
Authors: Peng, Xinyu; Li, Li; Wang, Fei-Yue
Views/Downloads: 21/0 | Submitted: 2021/01/06
Keywords: Training; Convergence; Approximation algorithms; Stochastic processes; Estimation; Optimization; Acceleration; Batch selection; machine learning; minibatch stochastic gradient descent (SGD); speed of convergence
Primal Averaging: A New Gradient Evaluation Step to Attain the Optimal Individual Convergence
Journal article | OAI harvesting
IEEE TRANSACTIONS ON CYBERNETICS, 2020, Volume: 50, Issue: 2, Pages: 835-845
Authors: Tao, Wei; Pan, Zhisong; Wu, Gaowei; Tao, Qing
Views/Downloads: 33/0 | Submitted: 2020/03/30
Keywords: Convergence; Convex functions; Machine learning; Optimization methods; Linear programming; Cybernetics; Individual convergence; mirror descent (MD) methods; regularized learning problems; stochastic gradient descent (SGD); stochastic optimization