
Thread: Improve the copy/paste data capability in Lubuntu?

  1. #1
    Join Date
    Jun 2021
    Beans
    12

    Improve the copy/paste data capability in Lubuntu?

    Hi. I switched to the Lubuntu desktop on Ubuntu 20.04 LTS recently, and some display and usability problems I had before no longer occur. But when a large number of data columns is copied from LibreOffice Calc and pasted into other software, the system can freeze and Calc gets stuck. Are there any optimization or customization methods to improve data transfer during copy and paste operations?
    Last edited by zhuxysoler; June 21st, 2021 at 02:19 AM.

  2. #2
    GhX6GZMB
    Join Date
    Jun 2019
    Beans
    1,093

    Re: Improve the copy/paste data capability in Lubuntu?

    Either more RAM or a swap partition.

  3. #3
    Join Date
    Jun 2021
    Beans
    12

    Re: Improve the copy/paste data capability in Lubuntu?

    Quote Originally Posted by ml9104 View Post
    Either more RAM or a swap partition.
    The computer has 20 GB of RAM and 15 GB of swap. When copying data, neither the RAM usage nor the swap usage changed suddenly before the system froze.

  4. #4
    Join Date
    Aug 2017
    Location
    melbourne, au
    Beans
    Hidden!
    Distro
    Lubuntu Development Release

    Re: Improve the copy/paste data capability in Lubuntu?

    It may help if we know what release you're asking about.

  5. #5
    Join Date
    Jun 2021
    Beans
    12

    Re: Improve the copy/paste data capability in Lubuntu?

    Quote Originally Posted by guiverc View Post
    It may help if we know what release you're asking about.
    Sorry for leaving out that information. I use the Lubuntu desktop on Ubuntu 20.04 LTS.

  6. #6
    Join Date
    Aug 2013
    Beans
    4,941

    Re: Improve the copy/paste data capability in Lubuntu?

    Does it take a long time for your data to load? What format is the table in (csv, ods, xls, xlsx)? What application do you cut and paste into?

    Yes, when the volume is huge it does take a while for the operation to complete. It may help a bit if you enable multithreading.

    Open Calc, go to Tools > Options > LibreOffice Calc > Calculate, and check "Enable multi-threaded calculation" under CPU Threading Settings if it is not already checked.

    I use a big test file with random entries (in three versions: .csv, .ods, and .xlsx; the .ods is 10 MB, the .xlsx is 16 MB, and the .csv is 40 MB, maybe because I lost the original and regenerated it from the .ods) to see how long it takes to load and save. Loading and saving are very fast for the .csv, but the .ods and .xlsx take a while and the window turns grey, though not nearly as badly as some people have reported (maybe 1 minute, with 6 GB of RAM). The .csv is issue-free, the .ods is OK except maybe slower, and the .xlsx is the most resource-demanding and performs worst; it often becomes unresponsive while editing.

    So this experiment kind of confirms the general point that spreadsheet data should be in .csv rather than Excel formats (for example) for ease of processing and interoperability. (Well, TBH the real answer is that one shouldn't use a spreadsheet program for serious data processing, period, but real data analysis software like R or Python; a short pandas sketch is at the end of this post.)

    LO 7.1.4.2


    I think Calc also has a limit on the number of columns, so for a csv with a large number of columns I use Gnumeric to view the file instead.
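
    For illustration, here is a rough pandas sketch of the "use Python instead of a spreadsheet" idea; the file name and column name are made up, so adapt them to your data:

    Code:
    # Rough sketch: process a large csv outside the spreadsheet GUI.
    # "data.csv" and the "value" column are placeholders.
    import pandas as pd

    df = pd.read_csv("data.csv")                              # load the whole table
    print(df.describe())                                      # quick summary of the numeric columns
    df[df["value"] > 0].to_csv("filtered.csv", index=False)   # filter rows and write a new csv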
    Last edited by monkeybrain20122; June 22nd, 2021 at 01:37 AM.

  7. #7
    Join Date
    Jun 2021
    Beans
    12

    Re: Improve the copy/paste data capability in Lubuntu?

    Quote Originally Posted by monkeybrain20122 View Post
    Does it take a long time for your data to load? What format is the table in (csv, ods, xls, xlsx)? What application do you cut and paste into? [...]
    Thanks for sharing your experience. I open csv and txt files in LibreOffice Calc, usually txt files; the txt and csv files are much larger than the xlsx ones. "Enable multi-threaded calculation" was already checked by default. I tested the copy performance: for example, I copied 90,000 rows × 1 column of data. After clicking copy, the system was stuck for about half a minute and RAM usage increased from 8 GB to 11 GB. Then Calc started responding again, and the data could be pasted into QtiPlot. So while the system is stuck nothing can be done, I just have to wait a moment. At least the processing does complete, it just takes more time.
    Yes, using code to process huge amounts of data is a better way. I also use matplotlib to visualize the data in some cases, but it takes more time to write the code.
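
    For what it's worth, a minimal matplotlib sketch for a single numeric column like the 90,000-row one above might look like this (the file name and the one-value-per-line layout are assumptions):

    Code:
    # Plot one numeric column straight from a txt/csv file instead of pasting it into a GUI.
    import numpy as np
    import matplotlib.pyplot as plt

    y = np.loadtxt("measurements.txt")   # assumed: one value per line
    plt.plot(y)
    plt.xlabel("row")
    plt.ylabel("value")
    plt.savefig("measurements.png", dpi=150)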

  8. #8
    Join Date
    Jun 2021
    Beans
    12

    Re: Improve the copy/paste data capability in Lubuntu?

    Quote Originally Posted by monkeybrain20122 View Post
    Does it take a long time for your data to load? What format is the table in (csv, ods, xls, xlsx)? What application do you cut and paste into? [...]
    Thanks, I gave it a try. In some cases Gnumeric is quicker than LibreOffice Calc.
    Last edited by zhuxysoler; June 22nd, 2021 at 04:58 AM.

  9. #9
    Join Date
    Mar 2010
    Location
    Squidbilly-Land
    Beans
    Hidden!
    Distro
    Ubuntu

    Re: Improve the copy/paste data capability in Lubuntu?

    IDK, but does copying just the values, without the formatting, make it faster?
    Also, are non-ASCII characters used? Just another point of information: if the characters are Unicode (non-ASCII), they can take up to 4 bytes each, so the data could be up to 4x larger.

    For anything larger than 500 rows, I'd look for a batch solution and avoid the GUI completely.
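
    As a rough example of such a batch approach (the file names and the column index are only placeholders), a few lines of Python can stream one column out of a big csv without loading it into any GUI:

    Code:
    # Copy the first column of a big csv into a plain text file, one row at a time.
    import csv

    with open("big.csv", newline="") as src, open("column.txt", "w") as dst:
        for row in csv.reader(src):
            dst.write(row[0] + "\n")   # row[0] = first column; change the index as needed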

  10. #10
    Join Date
    Aug 2013
    Beans
    4,941

    Re: Improve the copy/paste data capability in Lubuntu?

    So, just as an experiment, I copied one column with 1,048,576 rows from my test csv in Calc and pasted it into the text editor. I don't experience the problem you described. Copying was instantaneous with no lockup at all; pasting took longer and the text editor turned grey for a little while, indicating heavy memory use, but that lasted only a few seconds and there was no system lockup. Even while the text editor window was grey I could still switch to Firefox. I have only 6 GB of RAM on this machine running Ubuntu 20.04 (with the Unity desktop; I haven't tested it on GNOME). My column consists of random gibberish (long strings containing both letters and numerals).
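
    In case anyone wants to reproduce the test, something like this Python snippet (the row count and string length are arbitrary) generates a one-column csv of random alphanumeric strings:

    Code:
    # Generate a single-column csv of random 20-character alphanumeric strings.
    import csv, random, string

    chars = string.ascii_letters + string.digits
    with open("test.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for _ in range(1048576):                      # one full Calc column's worth of rows
            writer.writerow(["".join(random.choices(chars, k=20))])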
    Last edited by monkeybrain20122; June 22nd, 2021 at 06:20 AM.
