Where should I start with integration tests for legacy software in .NET?

Currently, I’m planning a new CI/CD project with Azure DevOps (using Git; the code is already committed) for an old solution, which contains 17 C# projects.

Technically, we have access to the source code, and we’re expected to write all the unit tests (the system wasn’t designed with any); however, as advised in this article:

Integration Testing made Simple for CRUD applications with SqlLocalDB

The best solution is to perform integration tests rather than unit tests, for several reasons:

  • It has been on the market for a considerable time (more than 5 years) without major changes; it is a legacy system.
  • There is not enough documentation of the entire system.
  • The support is limited to minor bugs.
  • It integrates several technologies: C#, SQL, ASP.NET MVC, console applications, SAP, etc.
  • Most of the people involved in this project no longer work here; therefore, knowledge of the business logic is minimal.
  • There would be thousands of cases to evaluate, which means a considerable amount of money and time.

I’d like to know if anyone has related experience or advice on how to perform them. What approach would you follow?

In my case, I’d like to focus specifically on the business logic, like CRUD operations, but what should this involve? A parallel database for storing test data? Any specific technology: xUnit, NUnit, MSBuild? Or how would you handle it?

P.S.

I see a potential issue with the article above, since it uses SQL LocalDB, and I’ve read that LocalDB is not supported in Azure; it’s probably the same in Azure DevOps.

Legacy deep-inheritance XML schemas: how to design APIs that map them to and from flat schemas?

Consider a purely hypothetical legacy proprietary library for XML models, which has some really deep nested inheritance within its corresponding POJOs: 1-10 fields per class, lots of special instance classes that extend archetypal classes, types as wrappers of lists of type instances, etc. The resulting model looks really pretty with some dubious performance specs, but that’s beside the point.

I want to make this work with the ugly, high-performance flat models that kids these days and people that claim not to have a drinking or substance abuse problem prefer for some reason. So say my beautiful, shiny model is something like this:

<QueryRequestSubtypeObject>
  <QueryRequestHeaders>
    <QueryReqParams>
      <Param value="4"/>
      <ParamDescWrapper index="12">
        <WrappedParamDesc>Foobar</WrappedParamDesc>
  ...

And the corresponding object as modeled by the vendor instead looks like

{
    paramVal = 4
    paramTypeIndex = 12
    paramDesc = "Foobar"
}

There are also regular updates to this ivory Tower of Babylon as well as updates to the business logic as vendor specs change.

Now the part where I convert my ancient classic into a teen flick is straightforward enough, however ugly it might be. Say something like below would be used by a query constructor and that would be enough abstraction for all business logic involved:

def extractParamVal(queryRequestSubtypeObject):
    return queryRequestSubtypeObject.getQueryRequestHeaders()
        .getQueryReqParams()
        .getParam()
        .getValue()

Alas, it is not that simple. Now I want to convert whatever ugly, flat, subsecond-latency response comes back into our elegant, delicate model (with 5-10 second latency; patience is a virtue, after all!). With some code like this:

queryRequestSubtypeObject = new QueryRequestSubtypeObject();
queryRequestHeaders = new QueryRequestHeaders();
queryReqParams = new QueryReqParams();
queryReqParamList = new ArrayList();
param = new Param();
param.setValue(4);
queryReqParamList.add(param);
queryReqParams.setQueryReqParamList(queryReqParamList);
queryRequestSubtypeObject.setQueryRequestHeaders(queryRequestHeaders);
...

Code like this needs to live somewhere, somehow, for each and every field that is returned, if someone wants to convert data into this hypothetical format. Some solutions I have tried:

  • External libraries: Libraries like Dozer use reflection, which does not scale well for bulk-mapping massive objects like this. MapStruct et al. use code generation, which does not cope well with deep nesting involving cases like the list wrappers I mentioned.

  • Factory approach: Generic response factories that take a set of transformation functions. The idea is to bury all model-specific implementation under business-logic-based abstractions. In reality, this results in some FAT functions.

  • Chain of responsibility: Methods that handle initialization of each field, other methods that decide what goes where from the vendor response, some methods that build a portion of the mapping, and still others that handle a sub-group… loooooong chains of responsibility.

Given that all of these approaches resulted in technical nightmares of some sort, is there an established way to handle cases like this? Ideally it would involve minimal non-business-logic abstraction while providing enough granularity to implement updates, and it would be technically solid as well. Bonus points for the ability to isolate any given component, wherever it sits in the model hierarchy, for unit testing without null pointers getting thrown somewhere.
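For what it’s worth, the transformation-function idea from the factory approach can stay manageable if each field mapping is one small lambda that can be unit-tested in isolation, rather than one FAT function. A minimal Java sketch of that shape, with all class and field names invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiConsumer;

public class MappingSketch {
    // Flat vendor-style response (hypothetical)
    static class FlatParam {
        int paramVal;
        FlatParam(int v) { paramVal = v; }
    }

    // Deeply nested target model, heavily simplified (hypothetical)
    static class Param { int value; }
    static class QueryReqParams { List<Param> params = new ArrayList<>(); }
    static class QueryRequestHeaders { QueryReqParams reqParams = new QueryReqParams(); }
    static class QueryRequestSubtypeObject { QueryRequestHeaders headers = new QueryRequestHeaders(); }

    // One small, individually testable mapping step per field.
    static final List<BiConsumer<FlatParam, QueryRequestSubtypeObject>> FIELD_MAPPINGS = List.of(
        (flat, target) -> {
            Param p = new Param();
            p.value = flat.paramVal;
            target.headers.reqParams.params.add(p);
        }
        // ...one entry per mapped field
    );

    static QueryRequestSubtypeObject toNested(FlatParam flat) {
        QueryRequestSubtypeObject target = new QueryRequestSubtypeObject();
        FIELD_MAPPINGS.forEach(step -> step.accept(flat, target));
        return target;
    }

    public static void main(String[] args) {
        QueryRequestSubtypeObject nested = toNested(new FlatParam(4));
        System.out.println(nested.headers.reqParams.params.get(0).value); // prints 4
    }
}
```

When vendor specs change, each update localizes to one entry in the list, and each lambda can be exercised in a unit test without constructing the full hierarchy.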

As-is analysis of legacy systems

I’ve been tasked with performing an ‘as-is’ analysis of a monolithic legacy system in my organisation. I’ve been conducting interviews with the technical team responsible for developing the system, along with other techniques (such as, but not limited to, observing people using the system, surveys, and meetings), and I’ve been creating data flow diagrams to map out the system.

As I’ve never conducted an ‘as-is’ analysis of a legacy system before, what is the best way to model it? I’ve been creating DFDs, although I’m beginning to question whether this is the most appropriate approach.

Trying to install 18.04.2 on a Dell Latitude E6500 with legacy BIOS

I ran Ubuntu 18.04.2 from the “live” disc with no problems. The installation appeared to go without a hitch, TWICE. However, after the installation completes, it will not boot. All I get is a blinking cursor in the top left-hand corner of the screen. I followed instructions for boot repair. That also seemed to run fine from the “live” disc, but it did not solve the boot problem. From what I can tell, the Dell Latitude E6500 does not have a UEFI mode. Some articles I read suggested that THAT was the problem, but they offered no solution. Can someone point me to a comprehensive article on how to install Ubuntu on the Dell Latitude E6500? Perhaps there is a different version that WILL install. THANK YOU.

Frank Fiamingo

How to interact with a legacy database whose tables lack some fields

I am working on a project with Spring Boot, Hibernate, and SQL Server. For various reasons, some of the project’s database tables must be synchronized with legacy database tables that lack some necessary fields …

The question is: what is the solution to this situation, and is there a best practice for it?

The solution I’m thinking of is to create a new table for the fields that do not exist in the main table, with a reference to the main table, and use a join to select.
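One way to sketch that extension-table idea with JPA/Hibernate is a separate entity that shares the legacy table’s primary key via @MapsId, so the legacy table itself stays untouched. This is only an illustrative fragment under that assumption; all entity, table, and column names are invented (and whether the imports are jakarta.persistence or javax.persistence depends on your Spring Boot version):

```java
import jakarta.persistence.*;

// Existing legacy table; Hibernate maps it but never alters it.
@Entity
@Table(name = "legacy_customer")
class LegacyCustomer {
    @Id
    private Long id;
    private String name;
}

// New table holding the fields the legacy table lacks,
// keyed by the same primary key as the legacy row.
@Entity
@Table(name = "customer_extension")
class CustomerExtension {
    @Id
    private Long id;

    @OneToOne(fetch = FetchType.LAZY)
    @MapsId                       // share the PK with the legacy row
    @JoinColumn(name = "id")
    private LegacyCustomer legacyCustomer;

    private String missingField;  // one of the fields the legacy table lacks
}
```

Selecting then becomes a one-to-one join between the two tables instead of duplicated columns.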

How do I write unit tests for legacy code (that I don’t understand)?


Foreword

I’ve read a lot of things before asking this question, including many relevant questions right here on SE:

  • (Software Engineering SE) Writing tests for code whose purpose I don’t understand
  • (Software Engineering SE) Unit testing newbie team needs to unit test
  • (Software Engineering SE) Best practices for retrofitting legacy code with automated tests
  • (Software Engineering SE) How to unit test large legacy systems?
  • (Blog post) How to mock up your Unit Test environment

However, I can’t help but feel that the itch hasn’t been scratched, even after all this reading.


TL;DR

How do I write unit tests for legacy code that I can’t run, simulate, read about, or easily understand? What regression tests are useful for a component that presumably works as intended?


The Whole Picture

I’m a returning summer intern, transitioning into grad school. My tasking involves these requirements:

  1. For a particular product, evaluate whether our software team can upgrade their IDE and JUnit version without losing compatibility with their existing projects.
  2. Develop unit tests for some component in the existing Java code (it’s largely not Java). We want to convince the software team that unit testing and TDD are invaluable tools that they should be using. (There’s currently 0% code coverage.)
  3. Somehow, end the days of cowboy coding for a critical system.

After obtaining a copy of the source code, I tried to build and run it so that I might understand what this product does and how it works. I couldn’t. I asked my supervisors how I should proceed, and I was issued a new standalone machine capable of building it, including the build scripts that actually do the job. That didn’t work either because, as they should’ve expected, their production code only runs on the embedded system it’s designed for. However, they have a simulator for this purpose, so they obtained the simulator and put it on this machine for me. The simulator didn’t work either. Instead, I finally received a printout of a GUI for a particular screen. They also don’t have code comments anywhere within the 700,000+ Java LOC, making it even harder to grasp. Furthermore, there were issues evaluating whether or not their projects were compatible with newer IDEs. In particular, their code didn’t load properly into the very IDE version they use.

My inventory is looking like this:

  • NetBeans 8, 9, 10, 11
  • JUnit 4, 5
  • Their source code for a particular product (includes 700,000+ Java LOC)
  • Virtually no code comments (occasionally a signature)
  • No existing tests
  • A physical photo of a GUI window
  • A software design document (109 p.) that doesn’t discuss the component in the picture

I at least have enough to theoretically write tests that can execute. So, I tried a basic unit test on said component. However, I couldn’t initialize the objects it depends on, which include models, managers, and DB connections. I don’t have much JUnit experience beyond basic unit testing, so follow me to the next section.


What I’ve Learned From My Reading

  1. Mocking: If I write a unit test, it likely needs to have mock variables for production dependencies that I can’t easily initialize in setUp.
  2. Everyone here liberally suggests the book “Working Effectively with Legacy Code” by Michael Feathers.
  3. Regression tests are probably a good place to start. I don’t think I have enough weaponry to attempt integration testing, and regression tests would provide more instant gratification to our software team. However, I don’t have access to their known bugs; I could possibly ask, though.
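On the mocking point: if a mocking framework isn’t in the toolbox yet, one framework-free option the Feathers book describes is the “subclass and override” seam, where a test substitutes a fake for a dependency it can’t construct (like a DB connection). A minimal sketch with invented class names:

```java
// A minimal, framework-free sketch of the "subclass and override" seam
// from Working Effectively with Legacy Code. All names are invented.

class DbConnection {
    // Imagine this opens a real database connection in production.
    String fetchStatus(int id) {
        throw new IllegalStateException("no database in the test environment");
    }
}

class LegacyComponent {
    // Seam: tests can override this factory method instead of hitting the DB.
    protected DbConnection openConnection() {
        return new DbConnection();
    }

    String describe(int id) {
        return "status=" + openConnection().fetchStatus(id);
    }
}

public class CharacterizationTestSketch {
    public static void main(String[] args) {
        // The testing subclass substitutes a canned fake for the real dependency.
        LegacyComponent component = new LegacyComponent() {
            @Override
            protected DbConnection openConnection() {
                return new DbConnection() {
                    @Override
                    String fetchStatus(int id) { return "OK"; }
                };
            }
        };
        // Characterization test: record what the code does today,
        // not what it "should" do.
        System.out.println(component.describe(42)); // prints status=OK
    }
}
```

The same shape works under JUnit 4 or 5; the assertion simply pins down the current behavior so future refactoring can be detected.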

And now, an attempt to articulate the uncertainty I still have as a question. Essentially, I don’t understand the how of writing these tests. Assuming I don’t receive any further guidance from my supervisors (likely), it’s in my court to not only learn what this component does but also to decide what tests are actually useful as regression tests.

As professionals who’ve worked with projects like this longer than I have, can you offer any guidance on how to write unit tests in this kind of situation?

Unable to access GRUB in Win10/Ubuntu18.04 legacy dual boot

I’ve installed Windows 10 Pro and Ubuntu 18.04 on primary partitions on a Dell OptiPlex 7010. I made sure to use legacy BIOS for the OS installations, on account of having partitioned the hard drive (MBR) using Ubuntu Live’s GParted. After finishing the Ubuntu installation, the PC doesn’t enter GRUB when booting; it just boots Windows. I think I created the 18.04 bootable USB about a year ago from 16.04; I don’t quite remember the settings (if there were any). I don’t know if this should make a difference, given that I selected legacy BIOS when I installed this attempted dual boot.

(I tried to tag this question “legacy-bios”, but the tag doesn’t seem to exist.)

Right way to replace a legacy system

I’m thinking about rewriting a legacy subsystem as a Go microservice. But first I need to write some acceptance tests.

In my opinion, the process should look something like this:

  1. Write acceptance tests
  2. Run them against the legacy system
  3. Write the microservice and make sure the tests pass

What are the solutions and best practices for this in the Go ecosystem?
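Language aside, the three steps above amount to golden-master testing: capture the legacy system’s responses over HTTP, then assert the replacement produces the same ones. A sketch of that workflow (shown in Java using only the JDK, with an invented endpoint and payload; in Go the idiomatic equivalent of the stub would be net/http/httptest):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GoldenMasterSketch {

    // Stand-in HTTP endpoint (invented); in practice the test first points at
    // the real legacy system, then at the replacement microservice.
    static HttpServer startStub(String body) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/status", exchange -> {
            byte[] bytes = body.getBytes();
            exchange.sendResponseHeaders(200, bytes.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(bytes);
            }
        });
        server.start();
        return server;
    }

    static String fetch(int port) throws IOException, InterruptedException {
        HttpRequest request = HttpRequest
                .newBuilder(URI.create("http://localhost:" + port + "/status"))
                .build();
        return HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
                .body();
    }

    public static void main(String[] args) throws Exception {
        // Step 2: capture the legacy system's answer as the golden master.
        HttpServer legacy = startStub("{\"ok\":true}");
        String golden = fetch(legacy.getAddress().getPort());
        legacy.stop(0);

        // Step 3: the rewrite passes when its answer matches the golden master.
        HttpServer rewrite = startStub("{\"ok\":true}");
        String candidate = fetch(rewrite.getAddress().getPort());
        rewrite.stop(0);

        System.out.println(golden.equals(candidate)); // prints true
    }
}
```

Because the tests talk only to the HTTP surface, the same suite runs unchanged against the legacy subsystem and the Go rewrite.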

I am planning to dual-boot my computer with Windows 10 and Ubuntu in legacy mode

Which should I use for legacy mode: Rufus or Universal USB Installer? Or does it even matter? I had a dual-boot system of Ubuntu and Windows 8.1, but I fresh-installed Windows 10 (I used Rufus to create the bootable USB, selecting MBR, not GPT), and now the system boots straight to Windows (no GRUB menu). I think my Ubuntu was installed in UEFI mode or something, so I am planning to fresh-install Ubuntu as well.