Leveraging Kobiton as a Manual Tester

Okay. So here we have the Kobiton portal. There's a wide variety of functionality displayed on the left here, but most important is our device management, the device lab management that we provide. By clicking on Devices, as I mentioned, we have our public cloud of devices, with more than 300 different devices and operating system combinations available to anyone on a pay-by-the-minute basis to help with your manual testing.

Say a scenario has been brought to your attention post-production: there's a defect specific to a particular device and a particular operating system that you don't have in your own arsenal of devices. You can then navigate to Kobiton's public cloud, search for that exact device, or something close to it, and test that edge case manually.

We also offer private and local devices. Private devices are dedicated to you specifically; we can either house them for you in our own data center or set them up locally on your side, and only you and your team may use them. So that's the wide range of device availability that Kobiton offers manual testers.

Okay, so let's launch a device. I'm going to go ahead and launch this Galaxy S10. Something to note is that Kobiton has the fastest devices on the market. We're able to achieve this with what we call Lightning Mode: essentially, our back end is set up with an additional GPU server that lets us reach 30 frames per second with reduced latency.

So once again, we have very fast and responsive remote devices. One of the biggest impacts of remote devices for manual testers is input lag. When you come into Kobiton, or any device lab manager, with specific test cases and specific test steps to perform, and a test step lags or isn't responsive, the user might think, "Did I actually click that button? Did I actually perform that test step?" and end up performing the step twice, which can impact their manual testing. It's also quite frustrating when you're trying to get a subset of testing done and you're at the mercy of slow, unresponsive devices; it almost seems counterproductive.

With Kobiton, once again, we have very fast and responsive devices available to our users, so manual testing can be as efficient and productive as it needs to be and you can move on to the next item in the never-ending checklist of testing you need to perform.

So this is what we call a manual session. Within Kobiton, you're using a real device that is being rendered through your browser, and you can interact with it in a multitude of ways. To start, I'm going to do a quick overview of a manual session.

With that, I'm going to launch Chrome on this real device and navigate to Kobiton.com. Some important information to start with: once again, this is a real device, and you interact with it the same as if it were in your hand, except now you're accessing it through your browser. So instead of swiping with a finger, it's just a matter of holding down your mouse button and dragging up.

You can swipe down the same way. You can swipe left or right, although I know Kobiton.com doesn't have any left-or-right functionality. We can even long press by holding the mouse button down; there you go. By long pressing on that image specifically, you can save it, download it, whatever you need, so we support long press as well. Okay, I'm going to go to Maps for a moment. Kobiton is located in Atlanta, and these devices are located in Atlanta too.
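Gestures like these also have a direct scripting equivalent once you move beyond manual testing. Below is a minimal, hedged sketch using the Appium Python client; the hub URL, credentials, capabilities, and screen coordinates are placeholders, not values taken from this session.

```python
# Minimal sketch: opening a remote Android session with the Appium Python client and
# reproducing the swipe and long-press gestures shown above.
# The hub URL, credentials, capabilities, and coordinates are placeholders.
from appium import webdriver
from appium.options.common import AppiumOptions

options = AppiumOptions()
options.load_capabilities({
    "platformName": "Android",
    "appium:automationName": "UiAutomator2",
    "appium:deviceName": "Galaxy S10",   # the device shown in the session info
})

driver = webdriver.Remote(
    "https://<username>:<api-key>@<your-appium-hub>/wd/hub",  # placeholder endpoint
    options=options,
)

# Swipe up (scroll the page) by dragging from a lower to an upper coordinate over 400 ms.
driver.swipe(start_x=540, start_y=1600, end_x=540, end_y=600, duration=400)

# Long press: a tap held at one point for 1.5 seconds.
driver.tap([(540, 1200)], duration=1500)
```

The later snippets in this section reuse this `driver` object rather than repeating the setup.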

So this is showcasing the Atlanta area. I want to show a couple of things you're able to do within a manual session, but before I get ahead of myself on the functionality, I'd like to give a quick overview. Here at the top, you see it's a manual session, and here we have the session information.

That includes the session ID currently being recorded, the device you're using, the operating system, and any additional information such as the UDID, the IMEI, or the serial number associated with this device. It's all there for you in the session info at the top. Then, to the right, we have what we call our real-time data panel.

Real-time meaning we offer the device log, the device inspector, and device metrics, all in real time. As you can see, the device log here is already populating with events pertaining to the device and the application in use. We also offer device metrics for real devices: the memory your application is using, the CPU, the network information, and, as I mentioned previously, battery drain.

As I like to say, in a manual session our device metrics are really picking up the exhaust your application puts off on a real device. You can then feed that into your analytics and metrics to ensure your application is performing as optimally as it can: it's not draining the battery, it's not taking up too much CPU, it's working as expected. And if it isn't, you can further optimize it.
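Outside the Kobiton panel, comparable Android metrics can also be pulled programmatically through Appium's UiAutomator2 driver. A hedged sketch, continuing with the `driver` from the earlier snippet; the package name is a placeholder.

```python
# Hedged sketch: pulling comparable performance data via Appium (Android / UiAutomator2 only).
# `driver` is the session created earlier; the package name is a placeholder.
package = "com.example.yourapp"

# Which data types this device/driver supports, e.g. cpuinfo, memoryinfo, batteryinfo, networkinfo.
print(driver.get_performance_data_types())

# CPU usage samples for the app under test.
print(driver.get_performance_data(package, "cpuinfo", 5))

# Memory usage samples for the app under test.
print(driver.get_performance_data(package, "memoryinfo", 5))
```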

We also offer a device inspector, and I'm going to navigate back to Chrome for this. The device inspector works very much like, say, Chrome's inspector or Appium Inspector: it captures the entire XML hierarchy of your mobile application, so all the objects associated with your application and all of their attributes within that hierarchy. This is fantastic for automation engineers looking to get started automating a test case. You have the ability to inspect an element here: simply hover over a particular button in the application itself, and it will be highlighted in purple so you can view its attributes, quickly copy the XPath associated with it, or see additional information such as the class or ID. So once again, this is the real-time data panel.
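Once an XPath has been copied from the inspector, it typically ends up in a locator like the one below. A hedged sketch reusing the earlier `driver`; the XPath string is a hypothetical example, not one taken from this session.

```python
# Hedged sketch: using an XPath copied from the device inspector as an Appium locator.
# The XPath below is a placeholder example.
from appium.webdriver.common.appiumby import AppiumBy

next_button = driver.find_element(
    AppiumBy.XPATH,
    '//android.widget.Button[@text="Next"]',  # placeholder XPath copied from the inspector
)
next_button.click()
```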

The logs can be downloaded, and you can pause or clear them as well, or filter by the events populating in the device log itself. Here to the left, we have the control panel. The control panel offers a multitude of functions you can perform on this real device. Currently we have it set to touch, but you can also do pinch-to-zoom, set the device location, and set the device time zone.

For Android devices, we also have the ADB shell available to you if you need to get into any debug mode. Then there are gestures. We offer the basic gestures of a real device: swiping, scrolling, and tapping. But if you need a specific gesture, you also have the ability to create custom gestures in whatever way you see fit for the manual testing of your application.
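Kobiton surfaces the ADB shell inside the session UI; as a rough local equivalent, the sketch below runs a few standard adb shell commands against a locally attached Android device. The package and activity names are placeholders.

```python
# Rough local equivalent of an ADB shell: requires the Android platform tools (adb) on PATH
# and a device attached via USB or Wi-Fi. Package/activity names are placeholders.
import subprocess

def adb_shell(*args: str) -> str:
    """Run `adb shell <args>` and return its trimmed stdout."""
    completed = subprocess.run(
        ["adb", "shell", *args], capture_output=True, text=True, check=True
    )
    return completed.stdout.strip()

print(adb_shell("getprop", "ro.build.version.release"))          # Android OS version
print(adb_shell("dumpsys", "battery"))                           # battery state
adb_shell("am", "start", "-n", "com.example.app/.MainActivity")  # launch a placeholder activity
```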

A good use case for this, especially on Android devices, is when you need to perform a double tap or some other unique gesture that puts your application into a debug mode. Even further, here we have our screenshot control. Within every manual session, you have the ability to take screenshots.

Say I want to take a screenshot of Kobiton's home page. I can do just that, and I see the counter has now incremented to one. I can view those screenshots as well and then download them locally to my computer. For both Android and iOS, we also have what we call image injection: if your application has any functionality that uses the camera, you can inject an image into your application on a remote device and use that within your manual testing checklist whenever you need to test your application's camera functionality.
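Screenshots can also be captured from a script. A minimal sketch, again assuming the `driver` session created earlier:

```python
# Hedged sketch: saving the current device screen to a local PNG file.
driver.get_screenshot_as_file("kobiton_home.png")

# Or grab the raw bytes if you want to attach them to a report.
png_bytes = driver.get_screenshot_as_png()
```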

Kobiton offers image injection for Android and iOS, and that can be shown at a later time, in a later course. Even further, for iOS we have accessibility testing functionality on our real devices as well, including iOS VoiceOver, which you can run on a real iOS device. But once again, that can be shown at a later time. For now, I just want to show other manual session controls such as pinch-to-zoom, device location, and device time. If we navigate from touch to pinch-to-zoom, you see these yellow circles appear that you can expand outward.

That will zoom in, and you can zoom in further or zoom back out. For device location, you can set the device's location to anywhere in the world by searching for it. Let's say I want this device to be in London; I can set that location. You can also set it via latitude and longitude. So that was a quick example of the wide range of functionality you can perform on a real, cloud-hosted device using Kobiton. Once again, we have the ability to set the device location, set the time zone, and handle any further functionality you might need for your manual testing, for whatever test case you need to run, on a real device. Okay.
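Setting the location by latitude and longitude has a direct scripting equivalent as well. A hedged sketch reusing the earlier `driver`; the coordinates below are simply central London's approximate latitude and longitude.

```python
# Hedged sketch: mocking the device's GPS location via Appium.
# 51.5074, -0.1278 is approximately central London; altitude is optional.
driver.set_location(latitude=51.5074, longitude=-0.1278, altitude=10)
```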

I'm going to navigate back to touch. Something I want to note: this is all fine and dandy on a web-based application in Chrome, but we also have the ability to install native mobile applications, any APK or IPA file. With Android, for example, you have an APK file; we have an application repository that you can upload your APK to and then install applications from there. That's one way to install an application, and it's the most useful way given the different versions you might be testing through. We also support downloading the application from a store, from the Play Store or the App Store, so that option for installing the application is available too. Once again, when you have uploaded your application file to our app repo, you can access that application within a manual session by clicking Install App here.

Once again, you're able to search for an application. Let's search for this one and install it on the device. Depending on how large your APK or IPA file is, the initial download might take a moment, but Kobiton also has caching in place, so once the application has been installed the first time, subsequent installs take less time.
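For completeness, installing and launching a build can also be scripted. A hedged sketch with the earlier `driver`; the APK path and package name are placeholders, and on Kobiton the upload itself would normally go through the app repo rather than a local path.

```python
# Hedged sketch: installing and launching an app build via Appium.
# The local APK path and package name are placeholders.
apk_path = "/path/to/wingman.apk"
package = "com.example.wingman"

driver.install_app(apk_path)            # push and install the build on the device
if driver.is_app_installed(package):    # verify installation by package name
    driver.activate_app(package)        # bring the app to the foreground
```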

So here I have the application, Wingman, successfully installed on this Android device, this real device, where I can begin my manual testing and even further adopt it into scriptless automation as well; we'll get into that in the next session. Say this is an application I'm looking to manually test. For the sake of this example, let's say I just want to ensure that tapping the Next button takes me to the next page and so forth, so these buttons are not broken. I can hit Next. And Next. And Next.

And I see that has brought me to the login page, where I can log in with my phone number or email. So my initial test case of navigating through the preliminary Next pages is working as expected: the Next button has retained its functionality. I've just tested it, and that completes my manual test case.
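The same check could later be expressed as a scripted test. A hedged sketch reusing the earlier `driver`; the locators and the "Log in" text are placeholder guesses for the Wingman app, not values captured from this session.

```python
# Hedged sketch: the manual "Next button" check expressed as a scripted test.
# Locators and expected text are placeholders for the Wingman app.
from appium.webdriver.common.appiumby import AppiumBy

for _ in range(3):  # tap through the introductory screens
    driver.find_element(AppiumBy.XPATH, '//android.widget.Button[@text="Next"]').click()

# Confirm the login page appeared (placeholder label).
assert driver.find_element(AppiumBy.XPATH, '//*[contains(@text, "Log in")]').is_displayed()

driver.quit()  # end the remote session when finished
```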

From there, I just want to note once again how Nova, our artificial intelligence engine, is capturing all the metadata and information on the manual session I'm performing. It's capturing every tap point I make and, for every element I interact with, the technical identifier associated with that element, so that when I exit this manual session I can further automate it on multiple devices.

Any time Nova is unsure of what you were looking to interact with, prior to running scriptless automation, Nova will ask you to help it out by annotating what you were looking to click. For this example, I think it was just a general area I was tapping, so I'll mark that and hit Submit, then do the same for a couple of others.

Once you exit a manual session, you're brought to our session overview and session details page. But really quickly, I want to show how, from running that one manual session, I can then automate it and run it across multiple devices.
