Abstract: The quality of the representations achieved by embeddings is determined by how well the geometry of the embedding space matches the structure of the data. Euclidean space has been the workhorse for embeddings; recently, non-Euclidean spaces, long used in other scientific fields, have gained attention in machine learning for their ability to better embed certain types of structured data. In particular, hyperbolic embeddings achieve excellent quality with few dimensions when embedding hierarchical data structures. We discuss several new approaches to producing hyperbolic embeddings. For data that is less uniformly structured, we propose learning embeddings in a product manifold that combines multiple copies of the canonical model spaces (spherical, hyperbolic, Euclidean), yielding a space of heterogeneous curvature suitable for a wide variety of structures.
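To make the product-manifold idea concrete, the following sketch computes a distance in a product of one hyperbolic (Poincaré ball), one spherical, and one Euclidean component, combining the component geodesic distances via the standard L2 formula d = sqrt(Σ dᵢ²). The function names and the specific component dimensions are illustrative, not the paper's API; this assumes each embedding is stored as separate per-component coordinates.

```python
import numpy as np

def poincare_dist(x, y):
    # Geodesic distance in the Poincaré ball model of hyperbolic space:
    # d(x, y) = arccosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
    sq = np.sum((x - y) ** 2)
    den = (1 - np.sum(x ** 2)) * (1 - np.sum(y ** 2))
    return np.arccosh(1 + 2 * sq / den)

def sphere_dist(x, y):
    # Geodesic (great-circle) distance between unit vectors on the sphere;
    # clip guards against floating-point values slightly outside [-1, 1].
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

def product_dist(h1, h2, s1, s2, e1, e2):
    # Distance in the product manifold H x S x E: the L2 combination
    # of the per-component geodesic distances.
    ds = np.array([
        poincare_dist(h1, h2),          # hyperbolic component
        sphere_dist(s1, s2),            # spherical component
        np.linalg.norm(e1 - e2),        # Euclidean component
    ])
    return np.sqrt(np.sum(ds ** 2))
```

Because the product distance decomposes per component, each factor can absorb the part of the data it fits best, e.g. tree-like substructure in the hyperbolic factor and cyclic substructure in the spherical factor.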