7
Bikonja
7y

An issue occurs, I resolve it with a "Could not reproduce", and then the client asks for a report on what happened, which parts it affected and how we will prevent it in the future...

To make things worse, it looks like it might have been an issue with the MSSQL server throwing weird data out and not a problem with my code...

Comments
  • 2
    Completely unrelated: I recently had an issue where GROUP_CONCAT hit the 1024 character limit. Because I was morphing the result and just ripping the guts out of it, I didn't pick up that it was losing all data past that point; if something wasn't there, my data morph just went on with the rest. Good luck with your thing.
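That GROUP_CONCAT failure mode is easy to sketch. The Python below is a hypothetical stand-in: the cap mimics MySQL's `group_concat_max_len` (which defaults to 1024), shrunk here so the demo stays short, and the "parser" mimics a forgiving transform that never notices the missing tail:

```python
GROUP_CONCAT_MAX_LEN = 20   # stands in for MySQL's 1024-byte default

def group_concat(items, sep=","):
    # the server silently truncates the result at the limit (a warning at most)
    return sep.join(items)[:GROUP_CONCAT_MAX_LEN]

rows = [f"id{i}" for i in range(10)]
# a forgiving downstream parser just drops empty pieces and carries on
parsed = [tok for tok in group_concat(rows).split(",") if tok]
# parsed now silently holds only the first 5 of the 10 ids; no error anywhere
```

The nasty part is exactly what the comment describes: nothing fails loudly, the data after the cutoff just never arrives.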
  • 1
    @penderis thank you!
    This is using NTILE over a large dataset to get quantiles of partitioned data over a column, getting only the first value from the 4th quantile (GROUP BY quantile, partitionColumn), and then later, in a subquery, a WHERE partition_column = x, so I have no idea how there could be more than one first value (first via ROW_NUMBER) for a partition... If I could reproduce the error, though, that would be fine, since I could say "OK, that's what's happening, I don't know why, but it's that"... But I can't reproduce it... :(
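For what it's worth, the shape of that query can be mocked up in memory. The partition names and data below are invented, and the `ntile` helper follows SQL Server's documented rule that the first `count % n` groups get one extra row:

```python
from collections import defaultdict

def ntile(n, count):
    """Bucket numbers (1..n) for `count` ordered rows, SQL Server style:
    the first (count % n) buckets are one row larger than the rest."""
    base, extra = divmod(count, n)
    tiles = []
    for t in range(1, n + 1):
        tiles += [t] * (base + (1 if t <= extra else 0))
    return tiles

# fake data: (partition_column, value), already ordered by value per partition
rows = [("A", v) for v in range(10)] + [("B", v) for v in range(7)]

by_part = defaultdict(list)
for part, val in rows:
    by_part[part].append(val)

firsts = {}
for part, vals in by_part.items():
    tiles = ntile(4, len(vals))
    # ROW_NUMBER within (partition, quantile); keep only rn = 1 of quantile 4
    q4 = [v for v, t in zip(vals, tiles) if t == 4]
    firsts[part] = q4[0]   # by construction, exactly one "first" per partition
```

Run on this toy data, partition A (10 rows, groups of 3/3/2/2) yields 8 as the first value of the 4th quantile and partition B (7 rows, groups of 2/2/2/1) yields 6, one value each, which is why the duplicate in production is so puzzling.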
  • 1
    Sounds hectic. I've never worked with NTILE, and sure as hell try to avoid SQL when I can. So, a thought from the peanut gallery: does the uneven split of groups in the quantile results not affect what you consider the first row of the 4th quantile? Maybe it's now the last row of the 3rd; maybe see if you can determine what you should get back and, if not, where it is located. **Surely I do not know of what I speak, but even House needed a buffer**
  • 1
    @penderis I consider the MAX(RN) value WHERE partition AND ntile = 4 (the RN is DESC) the start of the group, so uneven distribution should not affect it. Good try, though.
    And I stay away from SQL as much as I can as well, but that's not a lot in this job :/
    Also, a female developer who uses House quotes (and, while S04E01 was memorable as a whole, I'm not sure how memorable the "buffer" reference is, so relatively obscure references at that) - I think I'm in love :D
  • 1
    @Bikonja nope, don't be, my avatar is just female. She is pretty, I know.
  • 1
    @penderis oh, then I'm half in love :D
    I'm in the process of rewatching House for the millionth time, love that show :)
  • 1
    @Bikonja Yip, House is legend. I really hoped for a House and Wilson adventures movie to continue the ending, but I will settle for Rick and Morty.
  • 2
    @penderis as much as I love the show, I think it would get stale if they continued, so I'm happy... And with my shoddy memory I can rewatch it a lot :D
  • 0
    God, back to work, you slackers! There are NTILEs to be solved.

    On that note, I've not used NTILE either, but it'd be a pretty tricky problem for a DB to implement on live data, so I'm not surprised if it screws up. Are you using NOLOCK or READ UNCOMMITTED anywhere in that query? That can cause double reads, but sometimes so can standard READ COMMITTED.
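The double-read mechanism can be illustrated with a toy model (this is an illustration, not SQL Server's actual storage engine): a lock-free scan walks rows in key order, so when a concurrent update moves an already-read row to a higher key (think key change or page split), the scan meets the same row twice.

```python
table = {1: "a", 2: "b", 3: "c", 4: "d"}   # key -> row
seen, cursor = [], 0
while True:
    # the scan takes no locks; it just asks for the next key above the cursor
    nxt = min((k for k in table if k > cursor), default=None)
    if nxt is None:
        break
    seen.append(table[nxt])
    cursor = nxt
    if nxt == 2:
        # concurrent transaction moves row "a" from key 1 to key 9,
        # i.e. past the scan's cursor
        table[9] = table.pop(1)
# seen == ["a", "b", "c", "d", "a"]: row "a" was read twice
```

An aggregate like NTILE or ROW_NUMBER computed over such a scan can then legitimately see "two first rows" even though the committed data never contained duplicates.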
  • 0
    @Adamu hm, that's a good observation. I am using WITH (NOLOCK) (or at least I should be :) not sure, will check when I get to work), but yeah, even if that's the case, I can't prove it :( (Also, I feel like it might have been some caching, because it was crashing too consistently for a random read error.)