ROS testing on real hardware

Webunny
Posts: 1201
Joined: Sat Apr 28, 2012 1:30 pm

Re: ROS testing on real hardware

Post by Webunny »

milon wrote:
Webunny wrote:@milon: feel free to add your rig there as well
I was going to wait until it worked on one, but now you've got me asking myself Why wait? I'll put both test machines on there as soon as I get a chance (couple days maybe). :)

EDIT - We should eventually standardize the "Working Builds" column so that sorting by that column results in something useful. Maybe we should put up a template or something?
Well, it is already rather standardised, but if we're going to place working and non-working builds in the same 'testing' column, it might be useful to be able to see the status at a glance. I'm not sure if you can work with colours (for the text) on the wiki? Otherwise, we could establish the rule that working builds are listed in green, non-working ones in red, and partially working ones in orange letters. Also... maybe we would also have to agree that for working and non-working builds, we only list the first and the latest one(s), with a date, so we can get an impression of how ROS evolves over time.

Also, maybe " https://www.reactos.org/wiki/PC_ROS_Rigs " should be placed somewhere more obvious (a sticky thread?) and linked from hardware-testing related threads.

middings
Posts: 1014
Joined: Tue May 07, 2013 9:18 pm
Location: California, USA

Re: ROS testing on real hardware

Post by middings »

Webunny wrote:(W)e could establish the rule that working builds are placed in green, not-working on(es) in red, and partially working ones in orange letters.
I'm puzzled about what "working" means in this context. A set of defined, objective criteria that is more specific than "working" and "partially working" might describe the results in a much more useful and consistent manner.

Webunny
Posts: 1201
Joined: Sat Apr 28, 2012 1:30 pm

Re: ROS testing on real hardware

Post by Webunny »

middings wrote:
Webunny wrote:(W)e could establish the rule that working builds are placed in green, not-working on(es) in red, and partially working ones in orange letters.
I'm puzzled about what "working" means in this context. A set of defined, objective criteria that is more specific than "working" and "partially working" might describe the results in a much more useful and consistent manner.
Working means it runs as expected of that build. Partially working means it runs, but not as expected. Not working means it doesn't run at all. The only one that might need additional info is 'partially', but that's what the comments section is for. Those are pretty clear criteria, I think. I don't see any insurmountable difficulty in keeping it relatively simple like that, since it's meant to give an overview and show results at a glance... so I wouldn't make it too complicated or detailed there. If there really is a need to delve deeper into a particular question/error/issue, one can use a link to point people towards it.

milon
Posts: 969
Joined: Sat Sep 05, 2009 9:26 pm

Re: ROS testing on real hardware

Post by milon »

middings wrote:
Webunny wrote:(W)e could establish the rule that working builds are placed in green, not-working on(es) in red, and partially working ones in orange letters.
I'm puzzled about what "working" means in this context. A set of defined, objective criteria that is more specific than "working" and "partially working" might describe the results in a much more useful and consistent manner.
I'm kind of with middings on this. The definition for "working" will change over time as ROS improves. It may also change if/when significant core changes are made that introduce regressions or unearth nasty bugs. Personally, I would simplify it down to 2 categories: Working (at minimum it boots to desktop) and Not Working (fails to reach desktop). The comments section can be used to elaborate on anything else.

Black_Fox
Posts: 1584
Joined: Fri Feb 15, 2008 9:44 pm
Location: Czechia

Re: ROS testing on real hardware

Post by Black_Fox »

I don't have any HW at hand now, so pardon my intrusion: how about "working" means "same as when run in a VM", "partially working" means "at least boots", and non-working is anything worse than that?

fred02
Posts: 551
Joined: Thu Nov 22, 2007 5:54 pm

Re: ROS testing on real hardware

Post by fred02 »

milon wrote:Personally, I would simplify it down to 2 categories: Working (at minimum it boots to desktop) and Not Working (fails to reach desktop). The comments section can be used to elaborate on anything else.
Sounds like a good start. I would also distinguish how far it boots (first, second, third stage), with a link to a Jira bug report if available.
Black_Fox wrote:How about "working" means "same as when ran in VM", "partially working" means "at least boots", while non-working is even worse than that?
I don't think using a VM as a reference is a good idea, as some VM versions were known to fail with ROS in the past.

A more formal way to test the functionality would be to run the ROS test suite, but I don't know if/how feasible that is in the present state.

Black_Fox
Posts: 1584
Joined: Fri Feb 15, 2008 9:44 pm
Location: Czechia

Re: ROS testing on real hardware

Post by Black_Fox »

fred02 wrote:I don't think using a VM as a reference is a good idea, as some VM versions were known to fail with ROS in the past.
It's not a great reference, but at least the same VBox version should work the same for everyone, which certainly cannot be said about real HW :)

milon
Posts: 969
Joined: Sat Sep 05, 2009 9:26 pm

Re: ROS testing on real hardware

Post by milon »

Sounds like a standardized benchmark to test against might be useful. VirtualBox could make a good candidate, but it's still under development which means bugs, regressions, different versions, etc. Any suggestions?

oldman
Posts: 1078
Joined: Sun Dec 20, 2009 1:23 pm

Re: ROS testing on real hardware

Post by oldman »

by milon » 06 Jan 2014 16:33
Personally, I would simplify it down to 2 categories: Working (at minimum it boots to desktop) and Not Working (fails to reach desktop). The comments section can be used to elaborate on anything else.
I would agree with the above quote from milon, but 'Working = boots to desktop' needs to be qualified by a programme working, such as: Explorer opens and files can be viewed. There may be times when you can boot to the desktop but only the mouse works and everything else is frozen, so some programme must work for it to be classed as 'working'.

Webunny
Posts: 1201
Joined: Sat Apr 28, 2012 1:30 pm

Re: ROS testing on real hardware

Post by Webunny »

milon wrote:
middings wrote:
Webunny wrote:(W)e could establish the rule that working builds are placed in green, not-working on(es) in red, and partially working ones in orange letters.
I'm puzzled about what "working" means in this context. A set of defined, objective criteria that is more specific than "working" and "partially working" might describe the results in a much more useful and consistent manner.
I'm kind of with middings on this. The definition for "working" will change over time as ROS improves. It may also change if/when significant core changes are made that introduce regressions or unearth nasty bugs. Personally, I would simplify it down to 2 categories: Working (at minimum it boots to desktop) and Not Working (fails to reach desktop). The comments section can be used to elaborate on anything else.
Well, that's basically what I said, apart from 'partially working' as a category of its own; but I wasn't really fixed on that, it was merely because software testing uses similar levels.

And I don't think it will change that much: it runs or it doesn't. That's exactly why I would want to keep it simple: if you're talking about whether a particular application or (sub)system works on ROS, it becomes a whole other ballgame, because then it could be a problem specific to that application or subsystem, and not to ROS as an OS.

You already see that in some comments, which say 'no sound' or something to that effect. But, well: I don't have sound either, until I manually install the drivers, and then I do get sound. So it's not REALLY a fault or an error of the OS. I would say the OS is working just fine in that case, and if someone wants to elaborate on things, he can do so in the comments section and/or with a reference/link for further details.

Anyway, it isn't all that problematic. When it works, it means it boots, gets you to the desktop, and the system is responsive (i.e. you can open the pre-installed applications, for instance). When it hangs, gets a BSOD, or doesn't get you to the desktop at all, that constitutes 'not working'.

If people are unclear about 'partially working' - and indeed, it isn't clearly defined - maybe we should just drop it as a general category and limit it to whatever problems arose. In that case, if it works it would be a "yes, except..." and if it doesn't, it could be a "no, but...". Thus, if it boots and gets you to the desktop, but the sound doesn't work because you didn't install the drivers, it would be a 'working', and one could fill in the rest in the comments.

A boot that completes the first stage but not the second, for instance, would be a 'not working', and you could mention that it worked up until the first boot stage.
Black_Fox wrote:I don't have any HW at hand now, so pardon my intrusion: How about "working" means "same as when ran in VM", "partially working" means "at least boots", while non-working is even worse than that?
No, it's specifically meant for HW testing. And as shown by my own regression testing, there are bugs/regressions that are only noticeable on real HW. Of course, nothing stops you from making a wiki page specifically catered to VM testing... though basically that would be the ordinary testing that is already done (the majority uses VMs, after all). The majority of regression testing IS done with VMs, and testing a build of ROS in the same VM version would be the same for everyone; that's exactly the strength AND the weakness of VM testing. So it would only boil down to who's the first to test, and all the rest could only confirm it.

But, well, anyway... as said, the wiki is open to anyone. But the page at https://www.reactos.org/wiki/PC_ROS_Rigs *IS* specifically meant for real hardware testing, not VM.

Which isn't to say that VM testing doesn't need to be done anymore (I use VirtualBox/PuTTY myself); it's just that the main purpose of that wiki page *is* HW testing.
Last edited by Webunny on Mon Jan 06, 2014 10:14 pm, edited 1 time in total.

milon
Posts: 969
Joined: Sat Sep 05, 2009 9:26 pm

Re: ROS testing on real hardware

Post by milon »

Webunny wrote:...I'm not sure if you can work with colors (for the characters) on the wiki? Otherwise, we could establish the rule that working builds are placed in green, not-working ones in red, and partially working ones in orange letters. Also...maybe we also would have to agree that for good and non-working builds, we only place the first and the latest one(s), with a date, so we can get an impression of how ROS evolves over time.

Also, maybe " https://www.reactos.org/wiki/PC_ROS_Rigs " should be placed in a more obvious place (a sticky thread?) and in hardware-testing related threads.
I did some Google work, and our wiki is rather similar to Wikipedia. We can use the same formatting that Wikipedia does: http://en.wikipedia.org/wiki/Help:Using_colours
They have two ways of adding color (span tags and templates). We don't have a template set up, but that should be simple enough. The ROS Rigs page now has a color sample added to the table. I don't see a simple WYSIWYG interface for color - it's manual inline coding work.
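To make the span-tag approach from that help page concrete, here is a minimal sketch of how a results table could mark builds with coloured text. The build numbers are made up purely for illustration; the actual column layout on the ROS Rigs page may differ:

```wikitext
{| class="wikitable"
! Build !! Result
|-
| 0.3.16-r61500 || <span style="color:green">Working</span>
|-
| 0.3.16-r61230 || <span style="color:orange">Partially working</span>
|-
| 0.3.15-r60950 || <span style="color:red">Not working</span>
|}
```

Colouring only the result word (or the build number) keeps the table readable, while a fully coloured cell background can be harder on the eyes, as discussed below.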

Black_Fox
Posts: 1584
Joined: Fri Feb 15, 2008 9:44 pm
Location: Czechia

Re: ROS testing on real hardware

Post by Black_Fox »

Webunny wrote:No, it's specifically meant for HW testing. And as shown with my own regression-testing, there are bugs/regressions that are only noticeable with real HW.
If there's a ROS bug that blocks something from working EVERYWHERE, then you don't need real HW to find it. Comparison with a VM would find bugs that only happen on real HW (or only in a VM).

Webunny
Posts: 1201
Joined: Sat Apr 28, 2012 1:30 pm

Re: ROS testing on real hardware

Post by Webunny »

Black_Fox wrote:
Webunny wrote:No, it's specifically meant for HW testing. And as shown with my own regression-testing, there are bugs/regressions that are only noticeable with real HW.
If there's a ROS bug that blocks something from working EVERYWHERE, then you don't need real HW to find it. Comparison to VM would find bugs that only happen on real HW (or only in VM).
Your first sentence seems irrelevant: RH testing is not specifically meant to find bugs that block everything and anything, including VMs. It will find those bugs too, of course, but the greatest interest lies in finding bugs that don't show up in a VM. Furthermore, I find the first part of your second sentence very doubtful, unless you are ONLY talking about being able to compare whether a bug is present in a VM too: but VM testing is already done much more widely, so that is already easily noticeable. You could be right in your last part; there might be bugs only related to VMs. But since ROS is ultimately meant to run on HW, bugs that are only VM-specific have no relevance to real hardware testing, which is exactly why I think there is a clear difference between the two.

But rest assured: I fully support your endeavour if you wish to make and maintain a wiki page for VM testing. The only thing I was saying was that that particular page was deliberately and specifically made for HW testing, not VM testing.

Webunny
Posts: 1201
Joined: Sat Apr 28, 2012 1:30 pm

Re: ROS testing on real hardware

Post by Webunny »

milon wrote:
Webunny wrote:...I'm not sure if you can work with colors (for the characters) on the wiki? Otherwise, we could establish the rule that working builds are placed in green, not-working ones in red, and partially working ones in orange letters. Also...maybe we also would have to agree that for good and non-working builds, we only place the first and the latest one(s), with a date, so we can get an impression of how ROS evolves over time.

Also, maybe " https://www.reactos.org/wiki/PC_ROS_Rigs " should be placed in a more obvious place (a sticky thread?) and in hardware-testing related threads.
I did some Google work, and our wiki is rather similar to Wikipedia. We can use the same formatting that Wikipedia does: http://en.wikipedia.org/wiki/Help:Using_colours
They have two ways of adding color (span tags and templates). We don't have a template set up, but that should be simple enough. The ROS Rigs page now has a color sample added to the table. I don't see a simple WYSIWYG interface for color - it's manual inline coding work.

Interesting. I had a look at your examples, and I think the red letters are more easily readable than the red block. It's something like that which I envisaged when I talked about using a colour scheme, indeed. What specific usage would you recommend? I don't think it opportune to colour everything in... only (the names/numbers of) the builds, then? Red if they were not working, green if they were? It would indeed make it easier to see at a glance which worked and which didn't.

adrian15
Posts: 4
Joined: Fri Jan 03, 2014 4:23 am

Re: ROS testing on real hardware

Post by adrian15 »

Hello,

I have a Cadox PC from 1998 which happens to have a 5.25" floppy drive and USB connections added. The keyboard plug is serial. I suppose I'll add an Ethernet card, probably a Realtek one, to have networking on it.

I want it to be part of the official ReactOS testing system (otherwise it's kind of useless): controlling the computer's power with a third device, and using a low-consumption second computer (maybe a Raspberry Pi) which connects to the PC through its serial port, with a USB-serial converter I suppose (I currently use the serial port for the mouse :) !), so that ReactOS devs can debug why it doesn't boot at all. Just add PXE boot so that either the installation CD, the live CD, or a GNU/Linux "burn OS" (just any GNU/Linux that burns the ISO to the CD-RW in an automated way and then reboots) can boot. Is this the right place to ask for this? Or should I ask on the development mailing list?

On the other hand, I have edited the Real Hardware Testing (rig) wiki page:
  • Added my 1998 Cadox computer to the list
  • Implemented the colours in the working builds columns. (I know you are still discussing them, but I couldn't help trying. Just re-edit them if they look ugly.)
  • Added a notice inviting people who see the wiki page to come here (to this topic) and detail their experience (or at least link their new topic here). As I am not a native English speaker, it probably needs some rewriting. Again, feel free to improve it (or remove it if it was a bad idea).
For my Cadox computer I would like to try using the serial port to connect to the real hardware and be able to report logs of why ReactOS fails. I think I should check https://www.reactos.org/wiki/Debugging.
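Once a session log has been captured over the serial link (the Debugging wiki page covers the capture side, e.g. with PuTTY or a terminal program on the second machine), a quick filter can pull out the lines worth pasting into a report. A minimal shell sketch; the sample log contents and file paths below are invented purely for illustration, not real ReactOS output:

```shell
# Hypothetical sample of a captured serial debug log
# (a real one would come from the PuTTY/screen session).
cat > debug.log <<'EOF'
(ntoskrnl/ke/main.c:123) Initializing kernel
(hal/halx86/up/halinit.c:45) warn: legacy timer fallback
(win32k/ntuser/desktop.c:678) err: failed to create desktop
EOF

# Keep only the lines that usually matter in a failure report:
# errors, warnings, assertion failures and exceptions.
grep -E 'err:|warn:|Assertion|Exception' debug.log
```

This keeps the full log intact for devs who want it, while the filtered lines give a quick summary for the wiki comments or a Jira report.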

I'm not very sure, but I might write a wiki page about testing on real hardware, with links to the FAQs (with the UNIATA/ATAPI trick), the Debugging page, a new page on rebuilding the ISO image with genisoimage, and the Real Hardware Testing (rig) page. I haven't found a page like that in the wiki. The idea is: I want to test ReactOS on real hardware. Where do I begin? What should I expect to be different in my testing compared to virtual machine testing? You know, not trying to replicate what's already there for normal testing, but just real-hardware-specific stuff. The idea is to write down what I have learnt these days while performing my tests on my own, so that it's useful to other people.
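On the "rebuild the ISO with genisoimage" point, the usual recipe for repacking an El Torito boot CD looks roughly like the sketch below. This is only a sketch under assumptions: it assumes the original ISO has been extracted into ./iso-root/, and loader/isoboot.bin is a guess at the boot image path inside the ReactOS ISO - verify the real path in the extracted tree before relying on it:

```shell
# Sketch: repack a modified ReactOS boot CD with genisoimage.
# ASSUMPTIONS: original ISO extracted to ./iso-root/, and the
# El Torito boot image at loader/isoboot.bin (check your tree).
genisoimage -o ReactOS-rebuilt.iso \
    -b loader/isoboot.bin -no-emul-boot -boot-load-size 4 \
    -J -R -V "ReactOS" \
    iso-root/
```

The -b/-no-emul-boot/-boot-load-size options describe the El Torito boot image; -J and -R add Joliet and Rock Ridge naming so long filenames survive the repack.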

adrian15
