8 comments

  • coldcode 3 hours ago
    It's so interesting to see how much of a commodity charting/graphing has become. When we started building Deltagraph in late 1988, what we made became a kind of standard: we targeted PostScript and Illustrator output and included almost every kind of chart we could find, with ridiculous options for everything, so people used it worldwide, especially when targeting print. In the mid-'90s it was sold by the publisher (we just did the dev), and it spent the next 25 years with various owners before dying during the pandemic, all still based on the original C source code I started. I can't imagine how bad the code looked by then...
    • dcreater 2 hours ago
      And yet it's still not sufficiently commoditized and widespread. The majority of the workforce is using proprietary solutions that are out of date: Tableau, JMP in HW engineering, SAS and Excel.
  • SubiculumCode 3 hours ago
    Sure, ggplot, for example, is finicky, and you need to fuss over it to get the look you want, but then again, it is very flexible. Most of these solutions get frustrating as soon as you want to do, for example, spaghetti plots of within-subject repeated measures using age (not time-point) from accelerated longitudinal design data, with fixed-effect plots on top, e.g. this plot of mine [1].

    [1] https://imgur.com/a/gw2vV7w
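    For anyone curious, the overall shape of that kind of figure in a scripting language is roughly the following Python/matplotlib sketch. The file and column names are invented, and the pooled polynomial is only a crude stand-in for a real mixed-model fixed effect:

        import numpy as np
        import pandas as pd
        import matplotlib.pyplot as plt

        df = pd.read_csv("longitudinal.csv")  # hypothetical long-format data

        fig, ax = plt.subplots()
        # spaghetti: one faint line per subject, plotted against age
        for _, subj in df.groupby("subject"):
            ax.plot(subj["age"], subj["score"], color="grey", alpha=0.4, lw=0.8)

        # crude stand-in for the fixed effect: a polynomial pooled across subjects
        coef = np.polyfit(df["age"], df["score"], deg=2)
        ages = np.linspace(df["age"].min(), df["age"].max(), 200)
        ax.plot(ages, np.polyval(coef, ages), color="black", lw=2)

        ax.set_xlabel("age")
        ax.set_ylabel("score")
        plt.show()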
    • nxobject 17 minutes ago
      I just needed to stop and say: as a biostatistician, boy do I love a beautiful, complex longitudinal design. I remember my old professor asking us how, at that point, we would decompose the data into cross-sectional and longitudinal effects, Lord's paradox, etc... and I still don't understand Lord's paradox as well as I should.
  • jtrueb 6 hours ago
    Obviously there is a lot of work here, but I am a bit confused. If you already have lab code in Julia, Matlab, R, Python, Excel, etc., what is the motivation to use this tool? Is this hot in a specific community?
    • tonyarkles 4 hours ago
      I'm potentially in the target demographic for this. I regularly bounce between R, Python, Maxima, and occasionally MATLAB/Octave. Passing data between these is usually done via the lowest common denominator: CSV. Having four completely different interfaces to these tools is a hassle. I'm also not a big fan of Jupyter, and if this feels better for me, it might be a decent Jupyter replacement even without the cross-language stuff.
      • MostlyStable 1 hour ago
        I'm someone who enjoys figuring out the details of making a nice-looking plot (in base R; I can't stand ggplot), but even as someone who enjoys it, LLMs are pretty much good enough that if I explain how I want the plot to look and how my data is structured, they can generate code that works on the first shot. It seems to me that, at this point, if you are already doing some coding in one of the above languages but either don't like or aren't comfortable making plots with them, LLMs can solve that for you. Unless they are significantly worse in the non-R options, which could be the case; it wouldn't surprise me if R has more plotting examples in the training set than the other languages.
    • jabl 5 hours ago
      I suppose this is a FOSS solution for roughly the same space occupied by commercial tools like Origin, which are very popular in some scientific communities.

      They can be useful if you have other tools (e.g. measurement software) that already produce the data you want, and you just want a GUI tool to create plots and maybe do some simple things like least-squares curve fitting, etc.
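
      For contrast, the scripted version of that point-and-click fit is roughly the following minimal Python sketch; the data and the exponential model are invented for illustration:

          import numpy as np
          from scipy.optimize import curve_fit

          # hypothetical model: exponential decay with an offset
          def model(t, a, tau, c):
              return a * np.exp(-t / tau) + c

          # synthetic noisy data standing in for a measurement
          t = np.linspace(0, 10, 50)
          y = model(t, 2.0, 3.0, 0.5) + np.random.normal(0, 0.05, t.size)

          params, cov = curve_fit(model, t, y, p0=(1.0, 1.0, 0.0))
          print(params)  # fitted a, tau, c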

      If you already do a lot of data wrangling in something with a programming language and plotting libraries accessible from said language, like the ones you mention, yeah, this is not the tool for you.

      • ajot 5 hours ago
        It is! I remember using this (or SciDAVis, a related project) a couple of years back in college. It was not as powerful as Origin was 10 years ago, but it ran on Linux.

        This is great for people who don't know how to program and don't want to learn.

        • pvitz 1 hour ago
          Same experience here! We used Origin and/or QtiPlot in a physics lab for the graphs and quick regressions.
    • wodenokoto 21 minutes ago
      Haven't tried this tool yet, but if it lets me drag and drop my data and visuals, that sounds like a great addition to those tools.
    • goku12 1 hour ago
      It's the use case. Here is one concrete example. I worked as a project engineer during the development of a launch vehicle. The telemetry data frames from every test and every flight were processed into numerous CSV or TSV files labeled with the parameter name. Those files could be very large depending on their sampling rates, especially for tests that lasted hours on end. You would conduct exploratory manual analysis on that data, which involved:

      * Quickly cycle visually through time-series graphs (often several hundred parameters). You'd have seen most of those parameters before and would quickly catch any anomalies. You can clear an enormous amount of data rapidly this way.

      * Quickly analyze a graph at various zoom and pan settings, maybe saving some as images for inclusion in documents. As above, the zoom and pan operations often follow each other within seconds.

      * Zoom into fine details, down to single-bit levels or single-sample intervals. There's a surprising amount of information you can glean even at these levels. I have run into freak but useful single events there, and since they're freak events, it's hard to predict in advance where they'll show up. So operation speed becomes a key factor again.

      * Plot multiple parameters (sometimes with different units) together to assess correlations or unusual events. We even used to have team analysis sessions where such visualizations were prepared on demand.

      * Do statistical or spectral analysis (periodograms, log or semi-log graphs, PDFs, etc.) - see the sketch after this list.

      * Add markers or notes within the graph (usually to describe events). Change the axes or plot labels. Change grid value formatting (e.g. do you want time in seconds or HMS?).
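
      For comparison, the scripted version of the spectral step alone looks something like this minimal Python sketch (the signal, sampling rate, and numbers are invented for illustration):

          import numpy as np
          from scipy.signal import periodogram
          import matplotlib.pyplot as plt

          fs = 100.0  # assumed sampling rate in Hz
          t = np.arange(0, 60, 1 / fs)
          # synthetic telemetry-like series: a 5 Hz tone buried in noise
          x = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)

          f, pxx = periodogram(x, fs=fs)
          plt.semilogy(f, pxx)
          plt.xlabel("frequency [Hz]")
          plt.ylabel("PSD")
          plt.show()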

      All the operations above are possible with Julia, Matlab, R or Python - and we did use almost all of them, depending on personal preference. But none of them suit the workflow described above, for one simple reason: speed. You don't have enough time to select each parameter by text or GUI. There must be a way to either quickly launch a visualization or cycle through the parameters as the investigator closes each graph. You also don't have time to set zoom, pan and labels by text; it must be done with the mouse (zoom and pan) and directly on the graph (labels and markers) in a WYSIWYG manner. And you don't want to run an FFT or a filter function, save the new series and then plot it - you want it done with a single menu selection. The difference is like using a C++ compiler vs Python in JupyterLab. The application we used was very similar to LabPlot.
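
      To illustrate why even that falls short, the closest scripted approximation of the "cycle through hundreds of parameters" workflow is something like the sketch below (the file layout and column names are invented). It works, but it gives you none of the interactive zoom, pan, or annotation that the GUI tools provide for free:

          import glob
          import pandas as pd
          import matplotlib.pyplot as plt

          # one CSV per parameter (hypothetical layout); assume columns: time, value
          for path in sorted(glob.glob("telemetry/*.csv")):
              df = pd.read_csv(path)
              fig, ax = plt.subplots()
              ax.plot(df["time"], df["value"], lw=0.5)
              ax.set_title(path)
              plt.show()  # closing the window advances to the next parameter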

      Now, Excel might seem like a better choice. In fact, LabPlot and our application both have a spreadsheet-like interface with the ability to directly import CSV, TSV, etc. But Excel just doesn't cross the finish line for our requirement. For example, to plot a time series in Excel, you have to select the values (column or cells), designate the axes, optionally define the axes and graph labels, start a plot, expand it to the required level and format the plot. At that rate, you wouldn't finish the analysis in a month. Those applications would do all of that on their own (the labels and other metadata were embedded in the data files by means of formatted comments). But an even bigger problem was the size of the data. Some of those files would, on import, slow Excel to the speed of molasses. The application had disk- and memory-level buffering that improved responsiveness to near-instant interactivity.
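
      As an aside, a loader honoring that kind of embedded metadata might look like the following sketch; the "# key: value" comment convention here is invented for illustration (our actual format was different):

          import pandas as pd

          def load_with_metadata(path):
              # collect "# key: value" header comments as plot metadata
              meta = {}
              with open(path) as fh:
                  for line in fh:
                      if not line.startswith("#"):
                          break
                      key, _, value = line[1:].partition(":")
                      meta[key.strip()] = value.strip()
              df = pd.read_csv(path, comment="#")
              return df, meta  # e.g. meta["label"], meta["unit"]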

      I hope this gives you an idea of where the tools you mentioned are not good enough replacements for LabPlot and similar tools.

      • tonyarkles 31 minutes ago
        Thank you for this fantastic elaboration. I am in a very similar boat (unmanned aerospace) and have very similar needs. I’ve been chewing on making my own application to do this but LabPlot looks like it has potential to be exactly what I’ve been dreaming about for a few years.
  • givinguflac 7 hours ago
    Pretty sure this is the project's GitHub:

    https://github.com/KDE/labplot

  • cl3misch 8 hours ago
    HN hug of death?
  • RedShift1 7 hours ago
    Unfortunately, the only database it supports is SQLite; I really wanted to hook this up directly to a database or a REST API. Going back and forth between exporting files and importing them into LabPlot is just too much work...
  • ntxvega1975 5 hours ago
    I can't tell what license is applicable.
    • echoangle 5 hours ago
      On https://labplot.org/frequently-asked-questions/ , under "Under what license is LabPlot released?", it says this:

      > LabPlot is licensed under GNU General Public License, version 2.0 or later, so to put it in a few sentences:

      > You are free to use LabPlot, for any purpose

      > You are free to distribute LabPlot

      > You can study how LabPlot works and change it

      > You can distribute changed versions of LabPlot

      > In the last case you have the obligation to also publish the changed source code as GPL.