fix: make merge base fetching more robust

Several static check scripts rely on identifying the Git merge base via
get_merge_base, which may fail if the local Git history is too shallow.
This causes failures when checking patches in CI or on LTS branches,
where the fetch depth is limited.

To address this:

- Evaluate the merge base once and reuse the stored merge_base value
  instead of invoking get_merge_base multiple times.
- If merge_base is not found initially, attempt to fetch more history
  from the remote using the appropriate GERRIT_REFSPEC.
- Exit with an error if a merge base cannot be found even after
  fetching.

This improves the reliability of the static checks on shallow clones
and in CI environments where the fetch depth is restricted.
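
For reference, a minimal sketch of the fetch-and-retry flow described
above, assuming get_merge_base prints the merge base (or nothing on
failure) and GERRIT_REFSPEC is the CI-provided refspec; the exact fetch
options used by the scripts may differ:

    merge_base="$(get_merge_base)"

    if [ -z "${merge_base}" ]; then
        # History may be too shallow; fetch more of it for the ref under test.
        git fetch --unshallow origin "${GERRIT_REFSPEC}"
        merge_base="$(get_merge_base)"
    fi

    if [ -z "${merge_base}" ]; then
        echo "ERROR: no merge base found even after fetching more history." >&2
        exit 1
    fi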

Change-Id: Icccec0eb2f29d254e54bbd6b639f6c1ef11291a3
Signed-off-by: Harrison Mutai <harrison.mutai@arm.com>
diff --git a/script/static-checks/static-checks-detect-newly-added-files.sh b/script/static-checks/static-checks-detect-newly-added-files.sh
index 5b85ce6..008a147 100755
--- a/script/static-checks/static-checks-detect-newly-added-files.sh
+++ b/script/static-checks/static-checks-detect-newly-added-files.sh
@@ -60,7 +60,7 @@
   echo "# Check to detect whether newly added files are analysed by Coverity in the patch"
   TEST_CASE="Newly added files detection check for Coverity Scan analysis on patch(es)"
 # Extracting newly added source files added between commits.
-  git diff $(get_merge_base)..HEAD --name-only --diff-filter=A "*.c" &> "$TFA_PATCH_NEWFILES_LIST"
+  git diff ${merge_base}..HEAD --name-only --diff-filter=A "*.c" &> "$TFA_PATCH_NEWFILES_LIST"
   if [ -s "$TFA_PATCH_NEWFILES_LIST" ]
   then
     file_updation_report