You can follow my IP and Cybersecurity Blog here. As of Summer 2022, I am once again Chief Editor.

My Technology and Cybersecurity blog can be found at canadaculture.art. I am once again its Editor-in-Chief.


October 1, 2022. Robots dream of morphing into humans to hack human emotion, a phenomenon known as Metamorphing.


May 4, 2022. Two examples of AI-generated art created from a sequence of words via CC12M. This cloud-based, open-source generator IMHO beats older models like Eponym.

It takes a good machine (think M1) and a lot of patience, because the AI is doing all the work. The human "skill and judgment" input is so minimal that I can say with certainty that current IP frameworks have gone bye-bye.

Operation: Draw me a sheep! In the first instance I asked the Art Generator to draw an impressionist-style painting of a horse in a poppy field, and in the second I wanted to see a cubist-style lavender guitar. Then I waited for the frames to be generated. I did nothing else (other than interacting with the code). I was impatient to see the results of the first painting, so I stopped the process after 89 frames. In the second case, I let the generator run up to 150 frames.
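For readers curious what "waiting for frames" looks like in practice, here is a minimal Python sketch of that loop. The render_frame function is only a placeholder standing in for the actual CLIP-guided CC12M generator; the prompt, file names, and frame count simply mirror the runs described above.

```python
# Minimal sketch of the frame-by-frame loop described above.
# render_frame is a placeholder for the real CLIP-guided CC12M generator;
# it returns a dummy image so the loop is runnable as-is.
from PIL import Image


def render_frame(prompt: str, step: int) -> Image.Image:
    # The real generator refines the image toward the prompt at each step;
    # here we just return a flat-colour frame that changes with the step.
    shade = min(255, step)
    return Image.new("RGB", (256, 256), (shade, shade // 2, 128))


prompt = "an impressionist-style painting of a horse in a poppy field"
max_frames = 89  # I stopped the first run here out of impatience

for step in range(1, max_frames + 1):
    frame = render_frame(prompt, step)
    frame.save(f"frame_{step:03d}.png")  # each saved frame shows the image converging
```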

I didn't mint either of these, so they are as ephemeral as they come. For those who mint their products, I recommend embedding smart contracts that automatically identify copyrighted works, follow each new copy through the blockchain, set royalty percentages, and send crypto payouts to whoever needs to be paid, with no lawyers, courts, or jurisdictional disputes required.
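To make the idea concrete, here is a rough sketch, in plain Python rather than an on-chain language, of the logic such a smart contract would encode. The field names, percentages, and wallet addresses are illustrative assumptions, not an existing standard.

```python
# Sketch of the royalty logic a minting smart contract could encode:
# identify the work, record each copy, fix a royalty percentage,
# and compute the payout owed to the creator.
from dataclasses import dataclass, field


@dataclass
class MintedWork:
    work_id: str            # e.g. a content hash identifying the copyrighted work
    creator_wallet: str     # where payouts are sent
    royalty_pct: float      # fixed royalty percentage, e.g. 0.10 for 10%
    copies: list = field(default_factory=list)  # every copy/resale tracked

    def record_sale(self, buyer_wallet: str, price: float) -> float:
        """Record a new copy and return the royalty owed to the creator."""
        royalty = price * self.royalty_pct
        self.copies.append({"buyer": buyer_wallet, "price": price, "royalty": royalty})
        # On-chain, this payout would be transferred automatically,
        # with no lawyers or courts involved.
        return royalty


work = MintedWork("0xhash-of-lavender-guitar", "0xCreatorWallet", 0.10)
print(work.record_sale("0xBuyerWallet", 200.0))  # -> 20.0 owed to the creator
```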

Given that the metaverse aims to turn each user into a creator, intellectual property rights need to be treated like basic human and robot rights. Each person on this planet generates some form of IP. They may not know it because of old-world statutory definitions, but it is the case. For example, you could mint the sound of your voice and automatically receive smart-royalties in your wallet whenever an avatar uses it for commercial or non-commercial purposes.

I believe that smart licenses and non-fungible contracts will be the most efficient way to enforce IP rights until there is a uniform metaverse court system to decide matters arising in the metaverse. I imagined this new form of smart-court framework a month ago and started writing about it. It will be a new mixed-reality justice system, accessible worldwide for free, combining algorithmic decision makers (for everything IP and for objective legal tests) with some human deciders (mostly engineers). Just as art generators paint images from the words we feed them, there could be visual testimony generators that recreate real-time holographic sequences of witnesses' versions of events, to be compared against one another. When this paper is finished I may mint it for good luck.
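As a toy illustration of what an "algorithmic decision maker for objective legal tests" could look like, here is a hypothetical rule-based check against the terms of a smart licence. The licence fields and the test itself are my own assumptions for the sketch, not an existing framework.

```python
# Sketch of an algorithmic decider for purely objective, machine-verifiable
# questions: did a given use respect the terms encoded in a smart licence?
from dataclasses import dataclass


@dataclass
class SmartLicence:
    allows_commercial_use: bool
    expires_on: int          # unix timestamp
    max_copies: int


def decide_infringement(licence: SmartLicence, use: dict) -> str:
    # Each check is objective, so this part of a dispute needs no human decider.
    if use["timestamp"] > licence.expires_on:
        return "infringing: licence expired"
    if use["commercial"] and not licence.allows_commercial_use:
        return "infringing: commercial use not licensed"
    if use["copies_made"] > licence.max_copies:
        return "infringing: copy limit exceeded"
    return "non-infringing"


licence = SmartLicence(allows_commercial_use=False, expires_on=1_700_000_000, max_copies=5)
use = {"timestamp": 1_650_000_000, "commercial": True, "copies_made": 1}
print(decide_infringement(licence, use))  # -> infringing: commercial use not licensed
```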


March 31, 2022. Mixed Reality Cyber Couture lifts the barrier between virtual and physical realities. Here, a 3D fashion avatar is modeling wearable NFTs in three cities.


March 28, 2022. The first ever Metaverse Fashion Week was held in Decentraland's Luxury Fashion District from March 24-27 and featured cyber-couture, virtual stores, NFT wearables, after-show parties, and more from some 70 brands. No headsets required. Virtual fashion week not only democratizes high fashion but also gives users the opportunity to experiment with style, identity, body type, and gender.

In the meantime, Europe adopted the Digital Markets Act, paving the way for a new world where the boundaries between systems and platforms will cease to exist. Interoperability is the way out of cyber-feudalism and into improved access to everything.

https://www.instagram.com/p/CbdyzLwlEt-/

As exciting as Metaverse Fashion Week is, we are still in the dark ages here. The tech exists, but we are artificially confined to claustrophobic, screen-bound experiences where the human body is monopolized by a device. The time has come to free ourselves from two-dimensional realities and experience depth.

For example, there is no need for handheld keyboard commands, because a phone can act as a motion sensor akin to an Azure Kinect DK, tracking your body and face movements and replicating them through your avatar in real time without your ever touching a device. This has been possible for at least 12 years with the original Kinect sensor (which I myself have been repurposing for the past six months). Today's phone sensors are even more advanced. The reason the metaverse hasn't moved an inch in the past decade is lack of interoperability: systems are prevented from reaching their full potential by corporate greed.
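To show how little is actually missing, here is a minimal sketch that turns an ordinary webcam (or a phone camera feed) into a motion sensor using OpenCV and MediaPipe's pose tracker. The landmark coordinates it prints are the kind of data you would stream to an avatar rig in real time; streaming to a specific avatar platform is left out, since that depends on the interoperability we still don't have.

```python
# Minimal body-tracking sketch: webcam in, pose landmarks out.
# Requires: pip install opencv-python mediapipe
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # any camera feed, including a phone used as a webcam
with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Normalised x, y (and relative z) per body landmark: this is
            # what an avatar rig would consume to mirror you in real time.
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"nose at x={nose.x:.2f}, y={nose.y:.2f}")
cap.release()
```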

https://www.instagram.com/p/CZ-hRHglkUv/?utm_source=ig_embed&utm_campaign=embed_video_watch_again

Rossita
Ria (Gen AI)