Towards optimisers that 'Keep Learning'
Abstract
We consider optimisation in settings where an optimiser must be applied to a continual stream of instances from one or more domains, and examine how such a system might 'keep learning': by drawing on past experience to improve performance, and by learning both to predict and to react to instance and/or domain drift.
| Original language | English |
|---|---|
| Title of host publication | GECCO '23 companion |
| Subtitle of host publication | proceedings of the Companion Conference on Genetic and Evolutionary Computation |
| Editors | Sara Silva, Luís Paquete |
| Place of Publication | New York, NY |
| Publisher | ACM |
| Pages | 1636-1638 |
| Number of pages | 3 |
| ISBN (Electronic) | 9798400701207 |
| DOIs | |
| Publication status | Published - 24 Jul 2023 |
| Event | Genetic and Evolutionary Computation Conference 2023 (GECCO'23) - Lisbon, Portugal Duration: 15 Jul 2023 → 19 Jul 2023 https://gecco-2023.sigevo.org/HomePage |
Conference
| Conference | Genetic and Evolutionary Computation Conference 2023 (GECCO'23) |
|---|---|
| Country/Territory | Portugal |
| City | Lisbon |
| Period | 15/07/23 → 19/07/23 |
| Internet address | https://gecco-2023.sigevo.org/HomePage |