Reproducible Model Sharing for AI Practitioners
[1] Ana Trisovic, et al. Advancing Computational Reproducibility in the Dataverse Data Repository Platform, 2020, P-RECS@HPDC.
[2] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[3] Alexandru Iosup, et al. Is Big Data Performance Reproducible in Modern Cloud Networks?, 2019, NSDI.
[4] Mark Chen, et al. Language Models are Few-Shot Learners, 2020, NeurIPS.
[5] Vatche Ishakian, et al. Serving Deep Learning Models in a Serverless Platform, 2017, 2018 IEEE International Conference on Cloud Engineering (IC2E).
[6] Micah Goldblum, et al. Analyzing the Machine Learning Conference Review Process, 2020, ArXiv.
[7] Takumi Sugiyama, et al. A study report on "Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks", 2017.
[8] Serjik G. Dikaleh, et al. Modernize Digital Applications with Microservices Management Using the Istio Service Mesh, 2018, CASCON.
[9] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[10] Clive Cox, et al. Serverless Inferencing on Kubernetes, 2020, ArXiv.
[11] Yuxi Li, et al. Deep Reinforcement Learning, 2018, Reinforcement Learning for Cyber-Physical Systems.
[12] David A. Patterson, et al. In-Datacenter Performance Analysis of a Tensor Processing Unit, 2017, 2017 ACM/IEEE 44th Annual International Symposium on Computer Architecture (ISCA).
[13] Philip Bachman, et al. Augmented CycleGAN: Learning Many-to-Many Mappings from Unpaired Data, 2018, ICML.
[14] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.
[15] Andrew McCallum, et al. Energy and Policy Considerations for Modern Deep Learning Research, 2020, AAAI.
[16] Ekaba Bisong, et al. Kubeflow and Kubeflow Pipelines, 2019, Building Machine Learning and Deep Learning Models on Google Cloud Platform.