# Maximum Sum Subarray – Naive Implementation

I am looking at a problem that is often asked in interview settings. The problem statement is:

> Given an integer array `nums`, find the contiguous subarray (containing at least one number) which has the largest sum and return its sum.

I want to talk about the naive solution, where we generate all contiguous subarrays, compute their sums, and take the maximum. Here is an implementation of the naive solution:

```javascript
function maxSubArray(arr) {
  let max = null;

  for (let i = 0; i < arr.length; i++) {
    for (let j = i; j < arr.length; j++) {
      // Sum the subarray arr[i..j] from scratch every time.
      let sum = 0;
      for (let k = i; k <= j; k++) {
        sum = sum + arr[k];
      }

      if (max === null || max < sum) {
        max = sum;
      }
    }
  }

  return max;
}

// maxSubArray([-2, 1, -3, 4, -1, 2, 1, -5, 4]) === 6  (the subarray [4, -1, 2, 1])
```

I know that this solution runs in `O(n^3)` time, but I do not understand why that is the case. If we “generate” all possible contiguous subarrays, I can explain why that alone is an `O(n^2)` operation:

Let’s say that our input array has length 4. We can create 1 contiguous subarray of length 4; 2 contiguous subarrays of length 3; 3 contiguous subarrays of length 2; and 4 contiguous subarrays of length 1. In other words, the number of contiguous subarrays for an array of length `n` is the sum of 1 to `n`, or `(n * (n + 1)) / 2`. In asymptotic analysis, that is `O(n^2)`.
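To sanity-check that counting argument, here is a small sketch (the function name `countSubarrays` is mine, not from the original code) that enumerates every `(start, end)` pair, which is exactly what the two outer loops of the naive solution do, and compares the count against the closed-form formula:

```javascript
// Count every contiguous subarray of an array of length n by
// enumerating (start, end) pairs with start <= end.
function countSubarrays(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = i; j < n; j++) {
      count++; // one subarray per (i, j) pair
    }
  }
  return count;
}

console.log(countSubarrays(4));   // 10
console.log((4 * (4 + 1)) / 2);   // 10
```

Both print 10 for `n = 4`, matching the 1 + 2 + 3 + 4 breakdown above.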

Now, I understand that summing up each individual subarray is **not** free, but here is where I am stuck. I do not know how to “add the work that we do to get the sum of each subarray” to my existing time complexity of `O(n^2)`.
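One way to make the extra work concrete is to instrument the naive algorithm and count how many times the innermost statement (`sum = sum + arr[k]`) actually executes. This sketch (the name `countAdditions` is mine, purely illustrative) keeps the same three loops as `maxSubArray` but counts additions instead of summing values:

```javascript
// Count the additions performed by the naive algorithm: the two outer
// loops enumerate the O(n^2) subarrays, and the inner loop pays the
// cost of summing each one.
function countAdditions(n) {
  let ops = 0;
  for (let i = 0; i < n; i++) {
    for (let j = i; j < n; j++) {
      for (let k = i; k <= j; k++) {
        ops++; // one addition per element of the subarray arr[i..j]
      }
    }
  }
  return ops;
}

console.log(countAdditions(10)); // 220
console.log(countAdditions(20)); // 1540
console.log(countAdditions(40)); // 11480
```

Doubling `n` multiplies the count by roughly 8 (2^3), which is the empirical signature of cubic growth, even though the number of subarrays only grows by roughly 4 (2^2).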