#accessible-vis

2020-07-30

Dan Bunis (17:01:44): > @Dan Bunis has joined the channel

Atul Deshpande (17:01:58): > @Atul Deshpande has joined the channel

Leonardo Collado Torres (17:01:58): > @Leonardo Collado Torres has joined the channel

Kayla Interdonato (17:05:00): > @Kayla Interdonato has joined the channel

Dan Bunis (17:06:01): > To get things started here, I’ll link the google-doc and the slides that we used for the BOF sessions.

Dan Bunis (17:06:12): > https://docs.google.com/document/d/1qCYv4vfBeKMTNMf1m6TgRvo9dyfmbR7qx772dHiuX5M/edit#

Simina Boca (17:08:09): > @Simina Boca has joined the channel

Dan Bunis (17:08:40): > https://docs.google.com/presentation/d/1PImGiuCj5a5KxM6rslXg98QggEht1oHqprV-LKwamHs/edit#slide=id.g8d3eb11ecc_3_0

Atul Deshpande (17:09:41): > Are there regulatory issues with conducting a survey? For example, if I go to reddit.com/r/colorblind with a bunch of figures and multiple-choice questions regarding the figures.

Leonardo Collado Torres (17:10:01): > My points were: > * interactivity helps resolve many of the issues (zoom, clicking on categorical color labels to turn them on/off) + maybe shiny inputs for color scales (for continuous variables) does most of it. Like Dan said, plotly::ggplotly() is quite easy to use. People love interactivity even if they're not visually impaired, so use that as a win-win to bring people in. Journals like eLife and F1000Research and others do allow interactive graphics. > * You just need a short paper (1 figure with data, like Aedin said) to start the conversation. Quick and easy to digest. People will be like: yup, this is a problem and I can surely try X or Y solution (you can list a few, but then have a more detailed supplementary table or website or something like that) > Links: > * 2011 paper by Kasper Daniel Hansen, Zhijin Wu, Rafael Irizarry, Jeff Leek https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3137276/. Here's the link to the same paper on the Nature Biotechnology site, but it's behind a paywall: https://dx.doi.org/10.1038%2Fnbt.1910. It's about half a page in the Nature Biotech journal format. > * Plotly examples: http://spatial.libd.org/spatialLIBD/ (includes a shiny input for the continuous color scale) and https://dash-gallery.plotly.host/dash-oil-and-gas/ - Attachment (PubMed Central (PMC)): Sequencing technology does not eliminate biological variability > RNA sequencing has generated much excitement for the advantages offered over microarrays. This excitement has led to a barrage of publications discounting the importance of biological variability; as microarray publications did in the 1990s. By comparing …
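A minimal sketch of the `plotly::ggplotly()` workflow mentioned above; the data and plot are illustrative, but the calls are the standard plotly API:

```r
library(ggplot2)
library(plotly)

# Any ggplot2 plot with a categorical colour mapping
p <- ggplot(iris, aes(Sepal.Length, Sepal.Width, colour = Species)) +
  geom_point()

# One call makes it interactive: readers can zoom, pan, hover for values,
# and click legend entries to toggle categories on/off
ggplotly(p)
```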

Leonardo Collado Torres (17:10:19): > (that’s the nicely edited version of what I wrote on the pathable chat)

Dan Bunis (17:11:27) (in thread): > This is a good question. I’m not sure.

Atul Deshpande (17:12:43) (in thread): > I’ll pin this in case someone knows the answer.

Aedin Culhane (17:14:04): > @Aedin Culhane has joined the channel

Leonardo Collado Torres (17:16:48): > I think that you could also write a second paper that is more tutorial focused on: here is how you can create some plots, check if they are colorblind-friendly, improve them, done :smiley: Or even include a section on how to add options to plotting functions that make them more friendly. Say, one of them could take as input a ggplot2 object, then make the shiny app with plotly + inputs for the color scale (see the sketch below) > > (just some ideas, dunno if they are implemented already)
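A hedged sketch of that idea, built around a hypothetical helper (named `make_palette_app()` here for illustration) that wraps a ggplot2 object in a shiny app with a colour-scale input:

```r
library(shiny)
library(ggplot2)
library(plotly)

# Hypothetical helper: takes a ggplot2 object `p` (with a discrete colour
# aesthetic) and serves it as an interactive plot with a palette selector.
make_palette_app <- function(p) {
  ui <- fluidPage(
    selectInput("pal", "Colour palette",
                choices = c("viridis", "cividis", "plasma")),
    plotlyOutput("plot")
  )
  server <- function(input, output, session) {
    output$plot <- renderPlotly({
      # Swap in the chosen viridis variant, then make the plot interactive
      ggplotly(p + scale_colour_viridis_d(option = input$pal))
    })
  }
  shinyApp(ui, server)
}

# Usage:
# make_palette_app(
#   ggplot(iris, aes(Sepal.Length, Sepal.Width, colour = Species)) + geom_point()
# )
```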

Leonardo Collado Torres (17:17:19): > then that second paper could be the basis for a BioC workshop. Basically, if it’s a BioC workflow, then it’s easy to publish it at F1000Research

Simina Boca (17:18:12) (in thread): > If you put it on Reddit I’m almost 100% sure you need an IRB

Leonardo Collado Torres (17:18:13): > though that second paper will have a narrower audience, hence why you want the other paper describing the problem & maybe mentioning some broad solutions (which is where I suggest mentioning the use of interactive graphs)

Simina Boca (17:18:43) (in thread): > It would probably still be considered “Human subjects research” - presumably w/ a quick IRB turnaround

Simina Boca (17:19:17) (in thread): > If it’s only co-authors that may not require it - people evaluate other papers all the time in manuscripts

Leonardo Collado Torres (17:19:18): > https://twitter.com/slowkow/status/1288945878871560197?s=20 - Attachment (twitter): Attachment > @fellgernon @atulpdeshpande @DanBunis @Bioconductor This thread may be of interest for automatically choosing distinct colors: > > https://twitter.com/jokergoo/status/1284405839252672513?s=20 - Attachment (twitter): Attachment > Does anyone know the tools that generate distinct random colors for many clusters, like the method in this post https://gdagstn.github.io/tsnecolors.html?

Leonardo Collado Torres (17:20:28): > I see that they link to https://cran.r-project.org/web/packages/Polychrome/vignettes/polychrome.html. I first found out about this package through the R 4.0.0 release notes (or one of the R Core blog posts prior to R 4.0.0), since that's what they used to come up with the new default R colors
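For reference, a small sketch of the Polychrome workflow from that vignette (the seed colours here are arbitrary examples):

```r
library(Polychrome)

set.seed(42)  # palette construction is randomized, so fix the seed
pal <- createPalette(20, seedcolors = c("#FF0000", "#00FF00", "#0000FF"))
swatch(pal)   # quick visual check of the 20 distinct colours
```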

Simina Boca (17:20:32) (in thread): > Not sure if it would be an issue because some authors are vision impaired - as long as everyone would be OK with readers knowing that some authors are vision impaired (like I don’t think there’s stigma attached to it but I guess it is technically health information?)

Leonardo Collado Torres (17:20:46): > https://developer.r-project.org/Blog/public/2019/11/21/a-new-palette-for-r/index.html

Leonardo Collado Torres (17:20:50): > that’s the link I was thinking about

Leonardo Collado Torres (17:22:11): > :open_mouth: it has an update: > > UPDATE 2019-12-03: Following feedback, the new default palette has been tweaked so that the new "magenta" is a little redder and darker and the new "yellow" is a little lighter and brighter. The former is to improve the discriminability between "blue" and "magenta" for deuteranopes and the latter is to improve the discriminability between "green" and "yellow" for protanopes. We would like to thank those who provided feedback and suggestions on the new palette, in particular Antonio Camargo, Brenton Wiernik, Ken Knoblauch, and Jakub Nowosad. > > Maybe those are people you can invite for your projects!
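For reference, in R >= 4.0.0 the tweaked palette described in that post is the default, and `grDevices` also exposes other accessible palettes:

```r
palette()                                  # the new default ("R4") palette
palette.colors(8, palette = "Okabe-Ito")   # a widely used colorblind-safe set
```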

Atul Deshpande (17:23:25) (in thread): > Thanks. One strategy could be to show figures filtered through colorblind filters to authors with normal vision.
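A sketch of that filtering idea using the colorspace package, which can simulate common colour-vision deficiencies for any vector of colours (the palette here is just an example):

```r
library(colorspace)

pal <- rainbow(6)  # a palette known to be problematic
deutan(pal)        # the same colours as seen with deuteranopia
protan(pal)        # ... protanopia
tritan(pal)        # ... tritanopia
# For whole figures, colorspace::cvd_emulator() opens an interactive
# emulator that applies these simulations to a PNG image.
```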

Dan Bunis (17:26:06): > Yes, I think these might indeed be good additional people to ask! … once we have the idea for the paper down a bit more concretely.

Leonardo Collado Torres (17:26:58): > Yup yup:smiley:

Dan Bunis (17:27:00) (in thread): > Thanks so much Simina. I think my tendency would be to avoid anything that might require an IRB.

Dan Bunis (17:28:21) (in thread): > I agree that even just the fact that the paper would make it clear that many authors are color-blind might be considered a personal health information disclosure, so this is something that maybe we could ask a journal first…

Atul Deshpande (17:29:28) (in thread): > That’s one benefit of an anonymous survey, I guess.

Dan Bunis (17:29:40) (in thread): > Perhaps again once the paper idea is more concrete… we could send a pre-submission inquiry to our intended journal where we ask for a clarification on that matter.

Dan Bunis (17:30:41) (in thread): > Personally, I don’t care if people know I’m color blind. Honestly, it’s revealed in the dittoSeq github already.

Ruth Schmidt (17:31:20): > @Ruth Schmidt has joined the channel

Matthew McCall (17:31:53): > @Matthew McCall has joined the channel

Atul Deshpande (17:32:13) (in thread): > Nor am I, but I would assume others have reservations.

Dan Bunis (17:34:19) (in thread): > :man-shrugging: maybe. As Simina pointed out, I don’t think there is much stigma, so I’m not so sure. I think we can just be up front then and make sure any additional authors would be okay with the assumed disclosure.

Atul Deshpande (17:34:21) (in thread): > Also, I guess there might be a slight difference between a github repo and a high-impact journal article.

Simina Boca (17:39:06) (in thread): > It may be worth thinking a bit more about the study design

Simina Boca (17:39:32) (in thread): > Like if we show figures w/o legends

Simina Boca (17:39:53) (in thread): > We may not want to show them to the people who selected the papers in the first place

Simina Boca (17:40:20) (in thread): > If we include legends people may be influenced in terms of how many colors/clusters they see

Dan Bunis (17:41:45): > I like the plan of having a (first) paper that assesses a subset of papers from high-impact journals. Perhaps with a single figure explaining our method of assessment + an example of a plot & its color-blind transformations + the results of our assessment. Then a brief set of suggestions of a few things that scientists & journals can do to ensure their figures are more accessible.

Aedin Culhane (17:42:22): > Anyone want to hang out/chat? Pop into https://meet.bioconductor.org/yEoLukjsyI - Attachment (meet.bioconductor.org): Jitsi Meet > Join a WebRTC video conference powered by the Jitsi Videobridge

Dan Bunis (17:42:53): > There are certainly a few things to figure out for that. Perhaps the first is what that method of assessment might be.

Dan Bunis (17:43:19) (in thread): > That’s a good point.

Simina Boca (17:45:12) (in thread): > I would not necessarily be dissuaded from doing the best study we can just by the need for an IRB approval. Risk to participants is minimal so it’s not gonna be evaluated like a drug study.

Dan Bunis (17:48:21) (in thread): > I’m mainly coming from a place of having never put together an IRB. And on top of those general uncertainties, I’m also worried that the cross-institutional, collaborative nature of this study might make it harder?

Aedin Culhane (17:50:10): > A few of us are just chatting if you want to join

Hyun-Hwan Jeong (18:25:36): > @Hyun-Hwan Jeong has joined the channel

Simina Boca (18:54:29) (in thread): > Yeah those things are always a somewhat added pain, but it’s doable

Martin Morgan (19:06:35): > @Martin Morgan has joined the channel

Kevin Blighe (19:24:58): > @Kevin Blighe has joined the channel

Atul Deshpande (20:22:52) (in thread): > The following could be example query formats: > Match the following six figures to the six legends given below. > > OR > > Which of the following legends is best associated with the given figure?

Simina Boca (20:49:31) (in thread): > I think @Aedin Culhane had the suggestion of having it be something like “How many cluster colors do you see?”, which I think may be good

Simina Boca (20:50:00) (in thread): > I think for most people it gets hard to distinguish past 5-6 colors

Simina Boca (20:50:34) (in thread): > So if we take out the legend people are really forced to rely on their eyes and not the legend

Atul Deshpande (20:51:36) (in thread): > Indeed. The criticism may be that not providing a legend intentionally reduces the interpretability of the figure.

Atul Deshpande (20:53:19) (in thread): > If I provide four well-chosen candidate legends with different numbers of colors, the “reader” will still need to rely on their eyes to choose the right one.

Atul Deshpande (20:53:43) (in thread): > I guess the answer lies in the question we want answered.

Simina Boca (20:54:48) (in thread): > Ah I see

Simina Boca (20:54:57) (in thread): > Yes, that could be an option!

Matthew McCall (22:32:50): > I was thinking a powerful first figure in a paper on this topic might be to have panel A show the original figure from a paper and then have panels B-?? show how that figure would look to people with different types of color blindness. I imagine it wouldn’t be too hard to find a figure that uses several different color pairs that are indistinguishable to different groups of people.
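The colorblindr package (linked later in this channel) does roughly this for ggplot2 objects; a minimal sketch, assuming an illustrative density plot:

```r
library(ggplot2)
# remotes::install_github("clauswilke/colorblindr")  # GitHub-only package
library(colorblindr)

p <- ggplot(iris, aes(Sepal.Length, fill = Species)) +
  geom_density(alpha = 0.7)

# 2x2 grid: the plot as it might appear with deutanomaly, protanomaly,
# tritanomaly, and full desaturation
cvd_grid(p)
```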

Matthew McCall (22:42:01): > As someone with deutan color blindness, I know I’ve struggled to explain to others what exactly I see, so I was really intrigued by these methods to transform an image into an approximation of what I see for someone with typical color vision

2020-07-31

Ayush Aggarwal (02:06:17): > @Ayush Aggarwal has joined the channel

Heather Turner (02:55:47): > @Heather Turner has joined the channel

Saulius Lukauskas (03:15:30): > @Saulius Lukauskas has joined the channel

Mike Smith (03:41:40): > @Mike Smith has joined the channel

Vince Carey (07:04:27): > @Vince Carey has joined the channel

Ruth Schmidt (12:04:36) (in thread): > thanks for this comment @Leonardo Collado Torres, totally agree with all of the above and wanna add a few recommendations. This is a super useful book for building interactive graphs using plotly or ggplot: https://plotly-r.com/index.html > > > - Attachment (plotly-r.com): Interactive web-based data visualization with R, plotly, and shiny > A useR guide to creating highly interactive graphics for exploratory and expository visualization.

Lambda Moses (12:04:53): > @Lambda Moses has joined the channel

Ruth Schmidt (12:07:24) (in thread): > you can use plotly or ggplotly graphs in a shiny app OR in a dash app and even publish that one alongside your publication, e.g.: https://www.crisprindelphi.design/ based on this paper: https://www.nature.com/articles/s41586-018-0686-x - Attachment (Nature): Predictable and precise template-free CRISPR editing of pathogenic variants > The authors use a machine-learning algorithm to predict the spectrum of CRISPR–Cas9-nuclease-mediated DNA repair outcomes at human genomic target sites.

Charlotte Soneson (12:16:20): > @Charlotte Soneson has joined the channel

Ruth Schmidt (12:20:10): > Hi all, there were a few questions yesterday that I didn’t get the chance to answer and link to the topic being discussed here. So, first: “where can I host a dash app?” For a tutorial see here: https://dashr.plotly.com/deployment. Second: “what is the difference to MetaboAnalyst and can it be incorporated / is it an alternative?” MetaboAnalyst is great, but not all outputs are interactive, and this is just an alternative way of showing how one can build their own app for metabolomics data. For the example I showed, I used MetaboAnalystR but also other packages to analyze my data, and then passed the output as a data frame into a plotly function to plot the interactive graphs and connect them to the components in the dash app. It’s straightforward to incorporate packages in a dash app; one would just need to run the analysis prior to starting the app. The only issue is that if it’s a heavy dataset, the app will also take time to load the data. But there are ways around that, e.g. by serializing data frames using the fst package (see the sketch below). - Attachment (dashr.plotly.com): Dash for R User Guide and Documentation | R & RStats | Plotly > Dash for R User Guide and Documentation. Dash is a framework for building analytical web apps in R and Python. - Attachment (fstpackage.org): Lightning Fast Serialization of Data Frames for R > Multithreaded serialization of compressed data frames using the fst format. The fst format allows for random access of stored data and compression with the LZ4 and ZSTD compressors created by Yann Collet. The ZSTD compression library is owned by Facebook Inc.
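A minimal sketch of the fst idea (illustrative data; `write_fst()`/`read_fst()` are the package's documented API):

```r
library(fst)

df <- data.frame(x = rnorm(1e6), y = rnorm(1e6))  # stand-in for a heavy dataset
write_fst(df, "data.fst", compress = 50)          # serialize once, compressed
df2 <- read_fst("data.fst")                       # fast load at app start-up
```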

Ruth Schmidt (12:27:19) (in thread): > @Matthew McCall I had this discussion with a colleague. What do you think about the RColorBrewer colorblind-friendly palettes? > > display.brewer.all(colorblindFriendly = TRUE) >

Ruth Schmidt (12:29:12) (in thread): > generally, what colour palettes would you recommend to use from the ggplot2 or other libraries?

Matthew McCall (13:17:36) (in thread): > I generally try to avoid color in paper figures because it’s hard for me to find colors I like and am confident work for a lot of people. Also, a lot of journals still charge extra for color figures. But none of that’s a real solution. I do tend to use those RColorBrewer palettes when I am using color, but I’m definitely open to something better.
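For context, a small sketch of pulling one of those palettes into a plot ("Dark2" is among the qualitative sets that pass the `colorblindFriendly = TRUE` filter):

```r
library(RColorBrewer)
library(ggplot2)

pal <- brewer.pal(8, "Dark2")   # for base graphics: col = pal
ggplot(iris, aes(Sepal.Length, Sepal.Width, colour = Species)) +
  geom_point() +
  scale_colour_brewer(palette = "Dark2")
```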

Rene Welch (16:10:07): > @Rene Welch has joined the channel

Dr Awala Fortune O. (16:13:51): > @Dr Awala Fortune O. has joined the channel

2020-08-01

Matt Ritchie (02:06:01): > @Matt Ritchie has joined the channel

2020-08-03

Jenny Drnevich (10:06:19): > @Jenny Drnevich has joined the channel

Aedin Culhane (16:13:46) (in thread): > Hi Dan, it could be as simple as: how many colors/clusters do you see? Which ones are difficult to determine? Then you could present % recall and even precision… Funny that the precision/recall Wikipedia image isn’t color-blind friendly - File (PNG): image.png

Aedin Culhane (16:15:06) (in thread): > You could even simulate UMAP cluster maps with the most popular color palettes and crowd-source: how many colors/clusters do you see?

2020-08-05

Leonardo Collado Torres (02:45:32): > https://twitter.com/rfortherest/status/1290679204481294337?s=12 - Attachment (twitter): Attachment > Make sure your #rstats plots are colorblind friendly using {colorblindr} by @ClausWilke > > https://bit.ly/2WTqvLS https://pbs.twimg.com/media/Eelp5W7WoAA0le9.png

2020-08-07

Lauren Hsu (00:21:55): > @Lauren Hsu has joined the channel

2020-09-10

Joselyn Chávez (10:39:03): > @Joselyn Chávez has joined the channel

2020-09-14

Aedin Culhane (13:10:54): > Nice visualization of 40-color flow cytometry immunophenotyping of cell subsets in human peripheral blood https://t.co/7enaBk5bDN?amp=1 - File (PNG): image.png

Aedin Culhane (13:11:49): > Is the numbering of clusters sufficiently good, @Dan Bunis @Atul Deshpande? Could the colors be better?

Atul Deshpande (13:57:37): > That’s actually very clearly readable for me.

Dan Bunis (14:18:09): > Pretty clear for me too. > I’d say the colors for the separate groups could be updated, as the CD4 T cell/Monocyte/B cell colors bleed together for me, and same for the CD8 T cell/NK. But it really feels like a minor point because the numbers allow me to tell what’s what.

2020-10-01

Aedin Culhane (10:15:50): > Any update here?

2020-10-17

Kevin Blighe (10:16:02): > @Kevin Blighe has joined the channel

2020-11-16

Carmen Abaurre (06:11:10): > @Carmen Abaurre has joined the channel

2021-01-01

Soumya Banerjee (09:40:14): > @Soumya Banerjee has joined the channel

2021-01-22

Jonathan Speh (05:13:46): > @Jonathan Speh has joined the channel

Annajiat Alim Rasel (15:41:05): > @Annajiat Alim Rasel has joined the channel

2021-03-12

Julia Romanowska (14:50:39): > @Julia Romanowska has joined the channel

2021-05-11

Megha Lal (16:43:42): > @Megha Lal has joined the channel

2021-05-21

Guido Barzaghi (04:14:17): > @Guido Barzaghi has joined the channel

2021-05-24

MarieV (13:42:09): > @MarieV has joined the channel

2021-07-23

Batool Almarzouq (15:51:10): > @Batool Almarzouq has joined the channel

2021-08-04

Ayush Aggarwal (19:19:59): > @Ayush Aggarwal has joined the channel

2021-10-10

Mike Smith (05:00:00): > Does anyone have any colour palettes they recommend for accessibility? I’m considering adding some text in the BiocStyle vignette about colour choice, and it’d be nice to link to some existing/recommended palettes.

Simina Boca (13:48:18): > Check out https://colorbrewer2.org/#type=sequential&scheme=BuGn&n=3

2021-10-11

Jenny Drnevich (11:45:02): > I use dittoSeq::dittoColors()
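A small sketch of that suggestion; `dittoColors()` (from the Bioconductor package dittoSeq) returns a colorblind-friendly colour vector usable with base graphics or ggplot2:

```r
library(dittoSeq)

cols <- dittoColors()                             # full colour vector
barplot(rep(1, 8), col = cols[1:8], border = NA)  # inspect the first 8
```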

Mike Smith (14:30:53): > Thanks for the suggestions. I made a pull request to BiocStyle (https://github.com/Bioconductor/BiocStyle/pull/91) and mentioned both.

2021-11-08

Paula Nieto García (03:18:37): > @Paula Nieto García has joined the channel

2021-12-14

Megha Lal (08:23:14): > @Megha Lal has left the channel

2022-02-15

Gene Cutler (11:59:28): > @Gene Cutler has joined the channel

2022-03-14

Mike Smith (17:19:34): > Does anyone prefer to use a “dark mode” when it’s given as an option on a website? Have you struggled with the colours in Bioconductor vignettes? I’m wondering if a dark mode switch is an enhancement we could add to BiocStyle for package vignettes, but I’ve no idea how big the audience is that we’re not serving with the current theme.

Lambda Moses (19:49:37) (in thread): > I think that would be nice. I usually change the RStudio theme to a dark one in the evening so my eyes feel more comfortable.

2022-06-09

John Hutchinson (09:07:24): > @John Hutchinson has joined the channel

2022-07-29

Jill Lundell (16:38:00): > @Jill Lundell has joined the channel

2022-09-17

Kartik (11:33:59): > @Kartik has joined the channel

2022-10-23

Vince Carey (13:36:59): > @Vince Carey has left the channel

2023-03-10

Joaquin Reyna (15:26:54): > @Joaquin Reyna has joined the channel

Edel Aron (15:34:58): > @Edel Aron has joined the channel

2023-05-06

Sarah Parker (14:03:44): > @Sarah Parker has joined the channel

2023-05-15

Michael Lynch (07:53:01): > @Michael Lynch has joined the channel

2023-06-19

Pierre-Paul Axisa (05:08:27): > @Pierre-Paul Axisa has joined the channel

2023-08-28

Abdullah Al Nahid (15:05:40): > @Abdullah Al Nahid has joined the channel

2023-09-15

Leo Lahti (04:53:00): > @Leo Lahti has joined the channel

2023-10-15

Lambda Moses (21:02:16): > Saw this package on Mastodon. It looks very helpful for accessibility: https://github.com/nicucalcea/Ra11y

2024-03-20

Lambda Moses (13:22:57): > I try to make my plotting functions colorblind friendly. I wonder what colorblind people think of fluorescent images, which may have red and green fluorophores. ImageJ also uses red and green colorization when the metadata doesn’t indicate a color for the channel. One can always simply view each channel separately. But I wonder if there already is a colorblind-friendly way to colorize different channels in fluorescent images to make a nice-looking pseudocolor image. I suppose one can use a colorblind-friendly palette, but then what about when two colors colocalize, like yellow when we have red and green at the same pixel?
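One convention sometimes recommended for this (an editorial sketch, not an answer from the thread) is a magenta/green composite instead of red/green, so colocalization renders as white rather than yellow; with EBImage from Bioconductor:

```r
library(EBImage)

# Stand-ins for two fluorescence channels (intensities in [0, 1])
ch1 <- matrix(runif(64 * 64), 64, 64)
ch2 <- matrix(runif(64 * 64), 64, 64)

# Put channel 1 in both red and blue (= magenta), channel 2 in green;
# pixels where both channels are strong appear white, not yellow
composite <- rgbImage(red = ch1, green = ch2, blue = ch1)
display(composite, method = "raster")
```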

2024-04-28

Danielle Callan (08:18:54): > @Danielle Callan has joined the channel

2024-11-17

Vince Carey (05:06:57): > @Vince Carey has joined the channel

2025-02-06

Liz Hare (11:49:32): > @Liz Hare has joined the channel

2025-03-05

Benjamin Hernandez Rodriguez (22:26:10): > @Benjamin Hernandez Rodriguez has joined the channel

2025-03-17

Sunil Nahata (09:26:02): > @Sunil Nahata has joined the channel

2025-03-18

Andres Wokaty (14:26:03): > @Andres Wokaty has joined the channel