The brain is one of the most energy-intensive organs. While much of this energy is used for neural information processing, fruit-fly experiments have shown that learning is also metabolically costly. First, we present estimates of this cost, introduce a general model of it, and compare it to the corresponding costs in computers. Next, we turn to a supervised artificial-network setting and explore a number of strategies that can reduce the energy required for plasticity: modifying the cost function, restricting plasticity, or using less costly transient forms of plasticity. Finally, we discuss adaptive strategies and the possible relevance for computer hardware.