data volume of large arrays

  • Hi.

    This may be a naive question, but what is the impact of arrays on data volume? My project is based on data like musical scales in tons of variations, and it will also store user input data, mostly reaction times, which multiplies the number of entries rapidly. Might that become an issue at some point, or is the data volume of arrays small enough to not really matter?

  • Volume as in bytes? It is negligible.

    Overhead from checking or parsing every cell can get unwieldy if it gets big, though. I have a little tool I use to parse my office printer's logs. The last one was 6,310 rows by 213 columns; in .csv format it was 7 MB, and in .ods it was less than 1 MB. I assume JSON would be similar to CSV.

    It took around a minute to run it through my Construct-made parser, which goes through row by row and writes relevant information to a second array. A Lua script did the same in about 10 seconds, but to be fair I never really explored different approaches for optimizing in Construct.

    As a storage medium, you probably don't have to worry about file size. Even if it somehow became an issue later, it would be simple enough to add a feature to export it into another format in chunks if so desired.
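    The row-by-row approach described above can be sketched roughly like this (a hypothetical sketch, not the actual tool; the column names and sample data are invented for illustration):

```javascript
// Hypothetical sketch of a row-by-row log parser: split a CSV into rows,
// keep only the columns of interest, and write matching cells to a
// second array. Column names and data here are made up for illustration.
function parseLog(csvText, wantedColumns) {
  const rows = csvText.trim().split("\n").map(line => line.split(","));
  const header = rows[0];
  // Find the index of each wanted column in the header row.
  const indices = wantedColumns.map(name => header.indexOf(name));
  const result = [];
  for (const row of rows.slice(1)) {        // go through row by row
    result.push(indices.map(i => row[i]));  // keep only the wanted cells
  }
  return result;
}

const csv = "time,user,pages,status\n09:01,alice,3,ok\n09:05,bob,12,jam";
console.log(parseLog(csv, ["user", "pages"]));
// → [["alice", "3"], ["bob", "12"]]
```

    A real CSV parser would also have to handle quoted fields and embedded commas; this only shows the row-by-row extraction idea.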

  • Hm, OK, I don't understand what parsing is, but I'm happy to hear that I can apparently store lots of values without any issues. Thanks! :)

  • Parsing basically just means reading through a set of data and extracting what you want from it.
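    As a toy example (the data format here is invented, not anything Construct-specific), "parsing" a stored string of reaction times could look like this:

```javascript
// Toy example of parsing: read through a stored string and extract
// what you want from it — here, the reaction times (in ms) as numbers.
// The "trialN:ms" format is invented for illustration.
const stored = "trial1:420;trial2:385;trial3:501";
const times = stored.split(";").map(entry => Number(entry.split(":")[1]));
console.log(times);
// → [420, 385, 501]
```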

  • Ah, ok, cool. Thanks for clearing that up!
