01:04:27 Josh Moore: Then Adam
01:05:01 Josh Moore: Then Andreas
01:05:36 Josh Moore: Then Aybuke
01:06:13 Josh Moore: Then Benjamin
01:06:48 Josh Moore: Then Christian
01:07:08 Josh Moore: Then David
01:07:23 Josh Moore: Then Eric
01:07:33 Josh Moore: Then Frances
01:07:58 Josh Moore: Then Giovanni
01:08:11 Josh Moore: Then Guillaume
01:08:22 Josh Moore: Then Jean-Karim
01:08:44 Josh Moore: Then Joel
01:09:22 Josh Moore: Then Ken
01:09:54 Josh Moore: Then Luca
01:10:26 Josh Moore: Then Martin
01:10:50 Josh Moore: Then Matthew
01:11:06 Josh Moore: Then Norman
01:11:12 Josh Moore: (wow, very fast)
01:11:24 Josh Moore: Then Rohola
01:11:46 Josh Moore: Then … Sourab?
01:12:57 Kevin Yamauchi: Lots of Basel folks here!
01:13:08 Josh Moore: Then Seb
01:13:25 Josh Moore: Then Tatiana
01:14:04 Josh Moore: Then Teresa
01:14:24 Josh Moore: Then Victor
01:14:59 Josh Moore: Then Willi
01:15:16 Josh Moore: Then the end of the first alphabetical run-through: Wouter-Michiel!
01:15:45 Josh Moore: Then missed people: Bishoy, Juan, … anyone else?
01:17:08 Josh Moore: Live notes for the session are available at https://hackmd.io/BqnK9Wm4QpGYAhYOoaFBQQ. Where possible, help to structure the notes for later publication rather than commenting in Zoom's chat. Thanks!
01:17:18 Kevin Yamauchi: Reacted to "Live notes for the s..." with 👍
01:17:50 Joel Lüthi: Next level: integrate metadata & chat functionality into OME-NGFF!
01:18:46 Sébastien Besson (Glencoe Software): We didn’t specify which year, maybe??
01:19:02 Kevin Yamauchi: Reacted to "We didn’t specify wh..." with 😂
01:19:26 Norman Rzepka: I think the ZEP meeting is next week
01:19:36 Joel Lüthi: Reacted to "We didn’t specify wh..." with 😂
01:19:46 Norman Rzepka: https://zarr.dev/zeps/meetings/
01:23:07 Josh Moore: josh@openmicroscopy.org
01:23:21 Matthew Hartley (EMBL-EBI): research
01:33:19 Kenneth Ho: REMI
01:33:53 Josh Moore: Everyone needs more 4-letter acronyms…
01:34:48 Kenneth Ho: Sorry, REMBI
01:35:30 Tatiana Woller: For image analysis: https://arxiv.org/ftp/arxiv/papers/2302/2302.07005.pdf
01:36:08 Adam Taylor: Yes can confirm we have implemented MITI for HTAN
01:36:23 Josh Moore: Reacted to "Yes can confirm we h..." with 👍🏽
01:37:11 Adam Taylor: I am keen to map terms from MITI/HTAN to REMBI and other models. We have a MITI governance board meeting next week, so I look forward to taking talking points from here to there.
01:37:24 Josh Moore: Reacted to "I am keen to map ter..." with 👍🏽
01:37:37 Aybuke K Yoldas (EMBL-EBI): Reacted to "I am keen to map ter..." with 👍🏽
01:38:07 Denis Schapiro: Reacted to "I am keen to map ter..." with 👍🏽
01:38:50 Adam Taylor: Reacted to "I am keen to map ter..." with 👍
01:39:29 Matthew Hartley (EMBL-EBI): Reacted to "I am keen to map ter..." with 👍
01:43:10 Alex Henderson, Manchester, UK: +1 for linked data
01:44:35 Josh Moore: Reviewer on the OME-Zarr paper: “wouldn’t really say JSON is human-readable” :)
01:44:42 Norman Rzepka: Reacted to "Reviewer on the OME-..." with 🙈
01:44:44 Joel Lüthi: Reacted to "Reviewer on the OME-..." with 🙈
01:44:52 Aybuke K Yoldas (EMBL-EBI): Reacted to "Reviewer on the OME-..." with 😅
01:44:55 Kenneth Ho: Reacted to "Reviewer on the OME-..." with 😅
01:45:09 Adam Taylor: We are using LinkML internally for a number of our data models at Sage (but not HTAN currently, which is in JSON-LD) and expect to expand our use of it.
01:45:11 Matthew Hartley (EMBL-EBI): Replying to "Reviewer on the OME-..." (, indent=2), it’s fine! (I kid)
01:45:21 Adam Taylor: Reacted to "Reviewer on the OME-..." with 🙈
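A minimal sketch of the pretty-printing Matthew's "(, indent=2)" quip alludes to, using only Python's standard json module; the "multiscales" dict below is an abbreviated, hypothetical example rather than a complete or validated OME-NGFF document:

```python
import json

# Abbreviated, hypothetical OME-NGFF-style attributes (not a validated document).
attrs = {
    "multiscales": [{
        "version": "0.4",
        "axes": [{"name": "y", "type": "space"}, {"name": "x", "type": "space"}],
        "datasets": [{"path": "0"}],
    }]
}

print(json.dumps(attrs))            # one long line: the reviewer's complaint
print(json.dumps(attrs, indent=2))  # indented: rather more human-readable
```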
01:45:22 Josh Moore: Reacted to "We are using LinkML ..." with 👍🏽
01:45:24 Giovanni Palla: Reacted to "Reviewer on the OME-..." with 😅
01:45:28 Benjamin Rombaut (VIB/UGent): Reacted to "Reviewer on the OME-..." with 😅
01:45:39 Josh Moore: Replying to "We are using LinkML ..." Adam: can you share a link for your models?
01:46:06 Adam Taylor: Replying to "We are using LinkML ..." https://github.com/ncihtan/data-models/
01:46:08 Matthew Hartley (EMBL-EBI): We also have our REMBI implementation partly written in LinkML (and plan to move over to it as the internal source of truth).
01:46:12 Josh Moore: Reacted to "https://github.com/n..." with 👍🏽
01:46:18 Adam Taylor: Reacted to "We also have our REM..." with 🚀
01:46:28 Josh Moore: Replying to "We are using LinkML ..." see also: https://www.ghga.de/resources/metadata-model
01:49:37 Norman Rzepka: Replying to "We also have our REM..." is that public yet?
01:49:50 Tatiana Woller: Reacted to "I am keen to map ter..." with 👍🏽
01:50:08 Norman Rzepka: Reacted to "We also have our REM..." with 🚀
01:50:11 Matthew Hartley (EMBL-EBI): Replying to "We also have our REM..." Not at the moment, I’ll see if we can get it to a point where we can share soon
01:50:25 Norman Rzepka: Reacted to "Not at the moment, I..." with 👍
01:50:39 Tatiana Woller: Reacted to "Not at the moment, I..." with 👍
01:51:19 Kenneth Ho: Reacted to "We also have our REM..." with 🚀
01:53:58 Norman Rzepka: I think archives are in a good position to help with metadata standard consolidation, by expressing their recommended standard.
01:54:03 Josh Moore: Reacted to "Not at the moment, I..." with 👍
01:54:30 Juan Nunez-Iglesias: 💯
01:54:38 Josh Moore: Reacted to "💯" with 🙂
01:55:24 Kevin Yamauchi: Reacted to "I think archives are..." with 👍
01:56:19 Eric Perlman: Reacted to "💯" with 🙂
01:59:33 Adam Taylor: +1 for Alex’s comments
01:59:52 Rohola H.: +1
01:59:53 Juan Nunez-Iglesias: +++
02:00:28 Juan Nunez-Iglesias: “information about the binary payload, everything else is extra” Can I have a t-shirt with that? 😂
02:00:39 Joel Lüthi: Reacted to "“information about t..." with 😂
02:00:40 Eric Perlman: Reacted to "“information about t..." with ❤️
02:00:45 Josh Moore: Replying to "“information about t..." I’d take one, too. `M` please.
02:00:48 Eric Perlman: Reacted to "“information about t..." with 👕
02:00:52 Joel Lüthi: Reacted to "“information about t..." with 👕
02:01:45 Matthew Hartley (EMBL-EBI): Reacted to "+1 for Alex’s commen..." with 👍🏻
02:04:37 Kenneth Ho: +1 Aybuke
02:05:02 Kenneth Ho: Images without metadata are just a set of numbers.
02:05:29 Aybuke K Yoldas (EMBL-EBI): Reacted to "Images without metad..." with 👊
02:05:46 Aybuke K Yoldas (EMBL-EBI): Reacted to "Images without metad..." with 🌟
02:07:19 Oliver Biehlmaier: I have to leave
02:07:42 Josh Moore: (Taking a screenshot of the above conversation :) )
02:08:26 Josh Moore: Point to Juan for the first mention of ChatGPT
02:08:50 Aybuke K Yoldas (EMBL-EBI): Reacted to "Point to Juan for th..." with 😅
02:09:15 Juan Nunez-Iglesias: Reacted to "Point to Juan for th..." with 😅
02:09:23 Juan Nunez-Iglesias: algorithm/visualisation people love sets of numbers 😃. And sometimes metadata doesn’t make sense (as with synthetic images).
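A minimal sketch of Kenneth's and Juan's point, assuming zarr-python's v2-style API: the same pixels once as a bare set of numbers, and once in a Zarr group with a little context attached. The attribute keys here are hypothetical placeholders, not OME-NGFF fields.

```python
import numpy as np
import zarr

# A bare array: just a set of numbers; nothing says what they mean.
pixels = np.random.randint(0, 2**16, size=(64, 64), dtype="uint16")

# The same numbers written to a Zarr group with a few attributes attached.
root = zarr.open_group("example_image.zarr", mode="w")
root.create_dataset("0", data=pixels, chunks=(64, 64))
root.attrs["axes"] = ["y", "x"]                       # hypothetical minimal metadata
root.attrs["pixel_size_um"] = {"y": 0.65, "x": 0.65}  # hypothetical key, not an NGFF field

print(root.attrs.asdict())
```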
02:10:02 Matthew Hartley (EMBL-EBI): I think “What is necessary minimal metadata to understand a collection of images in a scientific context”, “What subset of this *can* be embedded in a single image” and “What subset of this *must* be embedded in a single image” are different (though linked) questions, with different answers.
02:10:29 Aybuke K Yoldas (EMBL-EBI): Reacted to "I think “What is nec..." with 👍
02:11:21 Juan Nunez-Iglesias: In summary: I would like OME-NGFF to basically say: **if** you have metadata field X, **then** it should look like so. But there’s an if. An array of numbers is useful in many contexts.
02:11:39 Josh Moore: Not necessarily just a response to Matthew, but in general: I think there’s also not a single “minimal”, or at least not a single set of “required fields”, but a decision tree: “If you include A then you must include B”.
02:12:06 Josh Moore: HA. “Zwei Dumme, ein Gedanke” (“two fools, one thought”, i.e. great minds think alike)
02:12:14 Juan Nunez-Iglesias: 💯 (sorry, I haven’t updated my Zoom so I don’t have reactions 😅)
02:12:16 Matthew Hartley (EMBL-EBI): Yes, definitely (to Josh) (also Juan)
02:15:34 Adam Taylor: I’m supportive of channel/timepoint/slice metadata being within sub-zarr groups - this could help with cloud workflows where only a few channels need to be staged, e.g. for segmentation. But we will therefore need robust tools for metadata writing, editing and validating.
02:16:03 Josh Moore: Replying to "I’m supportive of ch..." … not to mention consolidation and potentially synchronization. 👍🏽
02:16:36 Sébastien Besson (Glencoe Software): From the experience of the HCS spec, there was definitely an element of compromise in splitting metadata at the appropriate levels (for the reasons above).
02:16:52 Josh Moore: Replying to "I’m supportive of ch..." Adam: with your LinkML, do you actually make use of the benefits of graph-based metadata?
02:16:53 Sébastien Besson (Glencoe Software): However, everything comes at a cost, and splitting means you need metadata consolidation
02:17:14 Sébastien Besson (Glencoe Software): Or rather, you might need it (depending on your application/use case)
02:17:26 Joel Lüthi: Replying to "From the experience ..." Yes, just wanted to write something about this as well! There's certainly good value in being able to load metadata from just one point, not many .zattrs!
02:17:55 Sébastien Besson (Glencoe Software): Replying to "From the experience ..." I knew you were going to comment on this 🙂
02:18:02 Joel Lüthi: Reacted to "I knew you were goin..." with 😇
02:18:56 Aybuke K Yoldas (EMBL-EBI): Side-topic without preference on where to write: I think to me the problem is that the bioimaging world is not there yet with enough metadata (attached to images), so adapting minimal metadata needs to be done carefully, with encouraging and supporting REMBI even if it’s not required.
02:19:27 Adam Taylor: Replying to "I’m supportive of ch..." Yes, in both our JSON-LD and LinkML models we take advantage of dependencies, valid values and validation rules
02:19:55 Adam Taylor: Reacted to "Side-topic without p..." with ➕
02:20:49 Josh Moore: Replying to "I’m supportive of ch..." But do you store triples in multiple locations and then merge them later (or vice versa)?
02:21:24 Juan Nunez-Iglesias: Reacted to "Side-topic without p..." with ➕
02:21:41 Alex Henderson, Manchester, UK: Reacted to "“information about t..." with 🤷
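A minimal sketch of the consolidation step Sébastien and Josh mention, assuming zarr-python's v2-style API and a hypothetical per-channel layout: each sub-group keeps its own attributes, and a single consolidated document is written at the root so readers can fetch all metadata in one request.

```python
import zarr

store = zarr.DirectoryStore("example_plate.zarr")  # hypothetical path and layout
root = zarr.open_group(store, mode="a")

# Per-channel sub-groups, each carrying its own attributes.
for channel in ("0", "1"):
    g = root.require_group(channel)
    g.attrs["name"] = f"channel-{channel}"         # hypothetical attribute

# Merge the scattered .zgroup/.zattrs entries into a single .zmetadata at the root...
zarr.consolidate_metadata(store)

# ...so a reader can open everything with one metadata fetch.
consolidated = zarr.open_consolidated(store)
print([name for name, _ in consolidated.groups()])
```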
02:21:56 Aybuke K Yoldas (EMBL-EBI): +1 to J-K
02:25:45 Sébastien Besson (Glencoe Software): You need an API
02:25:52 Matthew Hartley (EMBL-EBI): API support for this in major languages would be kind of a prerequisite.
02:27:52 Denis Schapiro: I have to jump to the next meeting. Thank you for this exciting discussion, and please do not hesitate to reach out if there are any questions related to MITI
02:28:33 Juan Nunez-Iglesias: Aybuke fwiw I think my point above is very much compatible with *encouraging* REMBI. (Just looking at the paper now.) One point I want to add is that I’m currently working with materials scientists, and I’m also planning to work with astronomers, and surprise surprise, they all have the same issues we have with data storage, metadata, etc. So personally I’m angling for OME-NGFF to be broader than bioimaging, so the B in REMBI might be problematic? (Again, haven’t read the paper, will do so!)
02:29:37 Kevin Yamauchi: I have to run. Thanks, everyone! ❤️ NGFF!
02:30:08 Kenneth Ho: I have to go too. See you later in the evening
02:31:15 Adam Taylor: I hate to ever say anything nice about DICOM, but would the concept of ‘profiles’ be useful here to scale/select attributes, requirements or even structure required to meet different data models, instruments or techniques? https://www.dicomstandard.org/standards/view/media-storage-application-profiles#chapter_6
02:34:39 Aybuke K Yoldas (EMBL-EBI): Replying to "Aybuke fwiw I think ..." I’m actually an astronomer (just recently disguised as a bioinformatician). Astronomy has had the wonderful FITS standard for 30-40 years, but I think it’s recently hitting its ceiling due to big data volumes (IFU-type data). So would love to talk to you more about this.
02:35:46 Aybuke K Yoldas (EMBL-EBI): Replying to "Aybuke fwiw I think ..." On the B in REMBI, I was thinking like J-K: people should be able to write and provide their own standards. So REMBI would be one of them.
02:36:30 Juan Nunez-Iglesias: Replying to "Aybuke fwiw I think ..." nice! Would love to chat
02:36:37 Aybuke K Yoldas (EMBL-EBI): Reacted to "nice! Would love to ..." with 👍
02:37:06 Juan Nunez-Iglesias: Replying to "Aybuke fwiw I think ..." you can book time with me at meet.jni.codes, or email me at jni@fastmail.com
02:37:19 Aybuke K Yoldas (EMBL-EBI): Reacted to "you can book time wi..." with 👍
02:38:46 Josh Moore: fyi https://github.com/grimbough/Rarr
02:39:16 Juan Nunez-Iglesias: great name, needs a lion logo
02:39:48 Eric Perlman: Reacted to "great name, needs a ..." with 🦁
02:42:18 Josh Moore: Replying to "I hate to ever say a..." 👍🏽 but that would be per model, no?
02:43:07 Josh Moore: Replying to "Aybuke fwiw I think ..." I’ve definitely been to some meetings on “what’s post-FITS?” 🙂
02:43:50 Aybuke K Yoldas (EMBL-EBI): Reacted to "I’ve definitely been..." with 😎
02:51:26 Josh Moore: In that spirit: https://gist.github.com/joshmoore/d22be276c2177d86d564997ff7e87197 :)
02:52:13 Matthew Hartley (EMBL-EBI): I’ll volunteer us
02:52:54 Aybuke K Yoldas (EMBL-EBI): Reacted to "I’ll volunteer us" with 👍
02:53:25 sourabh: I can join efforts with Wouter-Michiel
02:53:55 Rohola H.: I can join as well, don't have a model yet
02:54:47 Joel Lüthi: @Wouter & @Josh: Thanks for organizing this call!
02:54:57 Josh Moore: Reacted to "@Wouter & @Josh: Tha..." with 👍🏽
02:55:24 Aybuke K Yoldas (EMBL-EBI): QUAREP-LiMi
02:55:44 Tatiana Woller: Elixir
02:55:59 Aybuke K Yoldas (EMBL-EBI): Replying to "QUAREP-LiMi" For vendor connections as well as the wider community
02:57:47 Jean-Karim Heriche: Euro-BioImaging is also a good way of reaching out to vendors.
02:58:14 Aybuke K Yoldas (EMBL-EBI): Reacted to "Euro-BioImaging is a..." with 👍
02:59:22 Tatiana Woller: Thank you
02:59:39 Teresa Zulueta-Coarasa | EMBL-EBI (she/her): Thank you!