Maxwell Nye
adept.ai
Verified email at alum.mit.edu
Title · Cited by · Year
Program synthesis with large language models
J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
Cited by 616 · 2021
Show your work: Scratchpads for intermediate computation with language models
M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ...
arXiv preprint arXiv:2112.00114, 2021
Cited by 389 · 2021
DreamCoder: growing generalizable, interpretable knowledge with wake-sleep Bayesian program learning
K Ellis, L Wong, M Nye, M Sable-Meyer, L Cary, L Anaya Pozo, L Hewitt, ...
Philosophical Transactions of the Royal Society A 381 (2251), 20220050, 2023
Cited by 182 · 2023
DreamCoder: bootstrapping inductive program synthesis with wake-sleep library learning
K Ellis, C Wong, M Nye, M Sablé-Meyer, L Morales, L Hewitt, L Cary, ...
Proceedings of the 42nd ACM SIGPLAN International Conference on Programming …, 2021
Cited by 149 · 2021
Write, execute, assess: Program synthesis with a repl
K Ellis, M Nye, Y Pu, F Sosa, J Tenenbaum, A Solar-Lezama
Advances in Neural Information Processing Systems 32, 2019
Cited by 142 · 2019
Learning to infer program sketches
M Nye, L Hewitt, J Tenenbaum, A Solar-Lezama
International Conference on Machine Learning, 4861-4870, 2019
Cited by 115 · 2019
Learning compositional rules via neural program synthesis
M Nye, A Solar-Lezama, J Tenenbaum, BM Lake
Advances in Neural Information Processing Systems 33, 10832-10842, 2020
Cited by 110 · 2020
Implicit representations of meaning in neural language models
BZ Li, M Nye, J Andreas
arXiv preprint arXiv:2106.00737, 2021
Cited by 108 · 2021
Improving coherence and consistency in neural sequence models with dual-system, neuro-symbolic reasoning
M Nye, M Tessler, J Tenenbaum, BM Lake
Advances in Neural Information Processing Systems 34, 25192-25204, 2021
Cited by 85 · 2021
The variational homoencoder: Learning to learn high capacity generative models from few examples
LB Hewitt, MI Nye, A Gane, T Jaakkola, JB Tenenbaum
arXiv preprint arXiv:1807.08919, 2018
Cited by 73 · 2018
Communicating natural programs to humans and machines
S Acquaviva, Y Pu, M Kryven, T Sechopoulos, C Wong, G Ecanow, M Nye, ...
Advances in Neural Information Processing Systems 35, 3731-3743, 2022
Cited by 42 · 2022
Show your work: Scratchpads for intermediate computation with language models, 2021
M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ...
URL https://arxiv.org/abs/2112.00114, 2021
Cited by 29 · 2021
Representing partial programs with blended abstract semantics
M Nye, Y Pu, M Bowers, J Andreas, JB Tenenbaum, A Solar-Lezama
arXiv preprint arXiv:2012.12964, 2020
Cited by 25 · 2020
Program synthesis with large language models. CoRR abs/2108.07732 (2021)
J Austin, A Odena, MI Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
Cited by 24 · 2021
Are efficient deep representations learnable?
M Nye, A Saxe
arXiv preprint arXiv:1807.06399, 2018
Cited by 22 · 2018
A large-scale benchmark for few-shot program induction and synthesis
F Alet, J Lopez-Contreras, J Koppel, M Nye, A Solar-Lezama, ...
International Conference on Machine Learning, 175-186, 2021
Cited by 21 · 2021
Program synthesis with large language models (2021)
J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ...
arXiv preprint arXiv:2108.07732, 2021
Cited by 12 · 2021
Language modeling with latent situations
BZ Li, M Nye, J Andreas
arXiv preprint arXiv:2212.10012, 2022
Cited by 8 · 2022
LARC: Language annotated Abstraction and Reasoning Corpus
S Acquaviva, Y Pu, M Nye, C Wong, MH Tessler, J Tenenbaum
Proceedings of the Annual Meeting of the Cognitive Science Society 43 (43), 2021
Cited by 4 · 2021
Prompting Machine-Learned Models Using Chains of Thought
JW Wei, D Zhou, DE Schuurmans, QV Le, MP Bosma, EHH Chi, ...
US Patent App. 17/881,746, 2023
2023