Ethereum’s Vitalik Buterin supports TiTok as a blockchain application

According to Ethereum (ETH) co-founder Vitalik Buterin, the new image tokenization method TiTok (Transformer-based 1-Dimensional Tokenizer) can compress images to a size small enough to store them on-chain.

On his Warpcast account, Buterin called the image compression method a new way to “encode a profile photo.” He went on to say that if an image could be compressed to 320 bits, what he calls “essentially a hash,” it would be small enough to put on-chain for every user.
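
For a rough sense of where that 320-bit figure comes from, the sketch below assumes each of the 32 tokens is an index into a codebook of 1,024 entries (the codebook size is an assumption for illustration, not something Buterin specified):

```python
import math

# 32 discrete tokens per image, each an index into an assumed
# codebook of 1,024 entries -> log2(1024) = 10 bits per token.
num_tokens = 32
codebook_size = 1024  # assumed for illustration

bits_per_token = math.log2(codebook_size)
total_bits = num_tokens * bits_per_token

print(f"{total_bits:.0f} bits (~{total_bits / 8:.0f} bytes) per image")
# -> 320 bits (~40 bytes) per image
```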

The Ethereum co-founder became interested in TiTok after seeing an X post by a researcher at the artificial intelligence (AI) image generation platform Leonardo AI.

The researcher, posting as @Ethan_smith_20, briefly explained how the method could help those interested in reinterpreting high-frequency details in images encode complex visuals into just 32 tokens.

Buterin’s take suggests that the method could make it much easier for developers and creators to create profile photos and non-fungible tokens (NFTs).

Fixing previous image tokenization issues

TiTok, developed jointly by TikTok parent company ByteDance and the Technical University of Munich, is described as an innovative one-dimensional tokenization framework, a significant departure from the two-dimensional methods currently in use.

According to the research paper describing the method, TiTok can compress 256 x 256-pixel images into “32 distinct tokens.”

The paper highlighted issues with previous image tokenization methods, such as VQGAN, in which tokenization strategies were limited to “2D latent grids with fixed subsampling factors.”

Such 2D tokenization cannot avoid the redundancy inherent in images, where nearby regions tend to be very similar.

TiTok promises to solve this problem by tokenizing images into 1D latent sequences, yielding a “compact latent representation” and eliminating that regional redundancy.
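
The sketch below (an illustration, not the authors’ code) contrasts the token budgets of a conventional 2D-grid tokenizer and a 1D tokenizer such as TiTok, using the 256 x 256 input size from the paper and an assumed downsampling factor of 16 for the 2D case:

```python
# 2D grid tokenizer (VQGAN-style): a fixed downsampling factor f maps a
# 256 x 256 image onto a (256 / f) x (256 / f) grid of token indices.
image_size = 256
f = 16  # assumed downsampling factor, for illustration
grid_tokens = (image_size // f) ** 2  # 16 x 16 = 256 tokens

# 1D tokenizer (TiTok-style): the image is encoded into a short latent
# sequence whose length is decoupled from the spatial grid.
sequence_tokens = 32

print(f"2D grid: {grid_tokens} tokens; 1D sequence: {sequence_tokens} tokens")
print(f"~{grid_tokens // sequence_tokens}x fewer tokens with the 1D approach")
```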

The tokenization strategy could also help streamline image storage on blockchain platforms.
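
As a purely hypothetical illustration of what that storage footprint looks like, the snippet below packs 32 token IDs (assumed to be 10 bits each, as in the earlier estimate) into a fixed 40-byte blob of the kind a profile record or smart contract field could hold; none of this comes from the paper:

```python
def pack_tokens(tokens: list[int], bits: int = 10) -> bytes:
    """Concatenate token IDs into one big-endian integer, then into bytes."""
    value = 0
    for token in tokens:
        value = (value << bits) | token
    total_bits = bits * len(tokens)
    return value.to_bytes((total_bits + 7) // 8, "big")


def unpack_tokens(blob: bytes, count: int, bits: int = 10) -> list[int]:
    """Recover the original token IDs from the packed blob."""
    value = int.from_bytes(blob, "big")
    mask = (1 << bits) - 1
    return [(value >> (bits * (count - 1 - i))) & mask for i in range(count)]


tokens = [(i * 31) % 1024 for i in range(32)]  # dummy token IDs
blob = pack_tokens(tokens)

assert unpack_tokens(blob, len(tokens)) == tokens
print(f"packed {len(tokens)} tokens into {len(blob)} bytes")  # 40 bytes
```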

It also delivers processing speeds up to 410 times faster than current technologies, a major step forward in computational efficiency.
