SlamData / SD-942

Add test to expose possible problem reading large datasets

    Details

    • Type: Task
    • Status: To Do
    • Priority: Medium
    • Resolution: Unresolved
    • Affects Version/s: 2.1
    • Fix Version/s: None
    • Component/s: Quasar

      Description

      As explained in the comments next to the code in question, MongoDB's API is confusing here: a negative limit asks the server for a single batch rather than the full result set. We currently pass in a negative limit, which might expose us to data loss down the road.

      We need to write a test that exposes the incorrect behavior, and then correct it.

      Here is an example of such a test:

      "read large data with limit" in {
                val COUNT = 10000
                val LIMIT = 9000
                val data = Process.range(0, COUNT).map(n => Data.Obj(ListMap("a" -> Data.Int(n))))
      
                (for {
                  tmp <- genTempFile
                  _   <- fs.save(TestDir ++ tmp, data).run
      
                  t   = fs.scan(TestDir ++ tmp, 0, Some(LIMIT)).map(_ => 1).sum.runLast
                  ds  <- t.run
                } yield {
                    ds must beRightDisjunction(Some(LIMIT))
                  }).run
              }
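
      The hazard can be modeled without a live server. In MongoDB's classic query API, a negative limit requests a single batch of at most |n| documents, capped by the server's per-message size limit, and then closes the cursor; a positive limit streams as many batches as needed. A minimal sketch of those semantics (`docsReturned` and the `maxBatchDocs` cap are illustrative stand-ins, not Quasar or driver code):

      ```scala
      // Model of MongoDB's limit semantics: a positive limit streams as many
      // batches as needed; a NEGATIVE limit returns a single batch of at most
      // |limit| documents -- further capped by the server's per-message size
      // limit -- and closes the cursor, silently dropping the remainder.
      // `maxBatchDocs` stands in for that size cap (illustrative value).
      def docsReturned(totalDocs: Int, limit: Int, maxBatchDocs: Int = 4096): Int = {
        if (limit >= 0)
          math.min(totalDocs, if (limit == 0) totalDocs else limit) // 0 = no limit
        else
          math.min(totalDocs, math.min(-limit, maxBatchDocs))       // one batch only
      }

      val positive = docsReturned(10000, 9000)   // streams batches: 9000 docs
      val negative = docsReturned(10000, -9000)  // one batch only: 4096 docs
      ```

      Under this model the test above would see fewer than LIMIT documents whenever the dataset's first batch is smaller than LIMIT, which is exactly the "large dataset" failure the ticket asks us to expose.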
      

            People

            • Assignee: Unassigned
            • Reporter: Jean-Remi Desjardins