

I don’t know who we have to thank at Intuit for the relatively new appearance of the invoice emails, but like most of the people I work with, I am in the professional services business, doing my best to be professional. Sending out invoices using QuickBooks’ new invoice emails makes that difficult.  Invoices sent to clients are a recurring touchpoint and an opportunity to create an impression, good or bad.  Intuit’s invoice emails are making everyone who uses them look like clowns.

Two clowns wearing colorful costumes and clown makeup laughing in front of a computer screen displaying a software interface.

I don’t need an invoice email that tells my clients “Your invoice is ready!”  Why else would I send an email with their invoice? I also don’t need to tell my clients to please pay their invoice on time.  Nor have I ever considered centering the text of an email, salutation and closing included. I can see Intuit’s name on my email, but where is my company name? Why is the day of the week the invoice is due included? That’s superfluous. And last, but not least, why would I want overcooked pea soup as the background of my email?  Who on earth is responsible for this?  Was AI used to assist? I think so.

An invoice notification titled 'Your invoice is ready!' showing invoice number 38323, due date March 27, 2026, and a balance of $1,000.00, with a 'View and pay' button and payment method icons. The message is addressed to 'Steve' from 'Joe', thanking him for his business.
Intuit QuickBooks’ New Invoice Email

If your firm is still using QuickBooks Desktop or Enterprise, you may be frustrated by the look of the new invoice emails, which were implemented and locked down to some degree regardless of what you might want them to look like.  I know I was.  Towards the end of last year, the format of the QuickBooks invoice emails changed suddenly and without warning.  Look, these emails were never awe-inspiring, but they had a semi-professional, not-too-awful look that we could work with.

I’ll admit that I sent out a couple of the newer invoice emails with some edits to keep me from appearing like a complete clown, but it was still embarrassing.  In my experience, the invoices reverted to the old format the next month, and I breathed a sigh of relief to regain the stale but professional-looking invoices I have sent via email for years.

Unfortunately, that relief was short-lived. In February, the invoice emails changed back to the god-forsaken format again (minus the part about paying the invoice on time), and now, if I didn’t do something to address the issue, my invoicing was going to look even worse.  Not only would I appear to have no design or business sense, but our invoice-related buffoonery would now include vacillating between multiple invoice email formats every month.  I called Intuit to air my frustration and hoped they could help resolve the issue.

It turns out I wasn’t alone.  A lot of people were unhappy about their inability to control the format of the invoice emails, and Intuit confirmed it.  We discussed the issue at length, and the representative told me there was nothing they could do to fix it … so, I fixed it. And if we were dealing with any software company but Intuit, that would probably be the end of the story.


The fix that worked until Intuit changed the email invoicing format again in April. 😐

I created Outlook VBA code that removes the day of the week, eliminates the red-orange color on the date, replaces “Your invoice is ready!” with my company name, removes the cents (.00) from whole-dollar amounts, left-aligns the text of the message in the body of the email, changes the color of the font and the background, and automatically adjusts the due date to the first day of the following month.  Then I customized the Quick Access Toolbar to create an icon you can click to run the automation.  The code below lets you update the invoices to make them appear more professional.

Sub FixQBEmail()

   LeftJustifyInvoicePara
   ProofAndCleanEmail

End Sub

Sub ProofAndCleanEmail()

    Dim oInspector As Inspector
    Dim oItem As MailItem
    Dim oDoc As Object
    Dim sBody As String
    
    ' Get the currently open/active email being composed
    Set oInspector = Application.ActiveInspector
    
    If oInspector Is Nothing Then
        MsgBox "No email is currently open for editing.", vbExclamation
        Exit Sub
    End If
    
    If oInspector.CurrentItem.Class <> olMail Then
        MsgBox "The active item is not an email.", vbExclamation
        Exit Sub
    End If
    
    Set oItem = oInspector.CurrentItem
    
    ' --- Work on the BODY ---
    ' For HTML emails, we modify the plain text representation carefully.
    ' Using Word editor object for rich/HTML body editing is more robust.
    
    If oInspector.EditorType = olEditorWord Then
        ' Email is using Word as editor (default in modern Outlook)
        Set oDoc = oInspector.WordEditor
        Call CleanWordDocument(oDoc)
    Else
        ' Fallback: work on plain text body
        sBody = oItem.Body
        sBody = CleanDollarAmounts(sBody)
        oItem.Body = sBody
    End If
    
    'MsgBox "Email proofing complete!", vbInformation
End Sub


' -------------------------------------------------------
' Cleans a Word document (the email body in Word editor)
' -------------------------------------------------------
Sub CleanWordDocument(oDoc As Object)
    Dim oTable As Object
    Dim oCell As Object
    Dim i As Integer, j As Integer, k As Integer

    ' First try Find/Replace on the whole document content

    Call FindReplaceDollars(oDoc.Content)
    Call ReplaceText(oDoc.Content, "Your invoice is ready!", "Quartare")
    Call FixDateLineColor(oDoc.Content)
    Call RemoveDayOfWeek(oDoc.Content)
    Call UpdateToFirstOfNextMonth(oDoc.Content)
    'Call ColorDateGreen(oDoc.Content)
    Call FixHTMLBackgrounds

    ' Then explicitly loop through all tables and cells
    For i = 1 To oDoc.Tables.Count
        Set oTable = oDoc.Tables(i)
        For j = 1 To oTable.Rows.Count
            For k = 1 To oTable.Columns.Count
                On Error Resume Next  ' some cells may be merged
                Set oCell = oTable.Cell(j, k)
                If Err.Number = 0 Then
                    Call FindReplaceDollars(oCell.Range)
                End If
                On Error GoTo 0
            Next k
        Next j
    Next i
End Sub

Sub FindReplaceDollars(oRange As Object)
    Dim oRegex As Object
    Dim oMatches As Object
    Dim oMatch As Object
    Dim sFind As String
    Dim sReplace As String
    
    ' Use RegEx to identify what needs replacing
    Set oRegex = CreateObject("VBScript.RegExp")
    With oRegex
        .Global = True
        .Pattern = "\$([0-9,]+)\.00"
    End With
    
    Set oMatches = oRegex.Execute(oRange.Text)
    
    ' For each match, do a safe literal Word find/replace
    For Each oMatch In oMatches
        sFind = oMatch.Value                    ' e.g. $250.00
        sReplace = "$" & oMatch.SubMatches(0)   ' e.g. $250
        
        With oRange.Find
            .ClearFormatting
            .Replacement.ClearFormatting
            .MatchWildcards = False
            .Forward = True
            .Wrap = 1
            .Text = sFind
            .Replacement.Text = sReplace
            .Execute Replace:=2
        End With
    Next oMatch
End Sub

Sub ReplaceText(oRange As Object, sFind As String, sReplace As String)
    With oRange.Find
        .ClearFormatting
        .Replacement.ClearFormatting
        .MatchWildcards = False
        .Forward = True
        .Wrap = 1
        .Text = sFind
        .Replacement.Text = sReplace
        .Execute Replace:=2
    End With
End Sub

Sub FixDateLineColor(oRange As Object)
    Dim oPara As Object
    Dim oWord As Object
    
    For Each oPara In oRange.Paragraphs
        If InStr(oPara.Range.Text, "| Due ") > 0 Then
            ' Loop through each word/run in the paragraph and recolor it
            For Each oWord In oPara.Range.Words
                oWord.Font.Color = RGB(50, 50, 50)  ' dark gray to match the body text
            Next oWord
        End If
    Next oPara
End Sub

Sub FixHTMLBackgrounds()
    Dim oInspector As Inspector
    Dim oItem As MailItem
    Dim sHTML As String
    
    Set oInspector = Application.ActiveInspector
    Set oItem = oInspector.CurrentItem
    
    sHTML = oItem.htmlBody
    
    ' Fix background colors - replace with whatever colors you want
    'sHTML = Replace(sHTML, "bgcolor=""#ECEEF1""", "bgcolor=""#FFFFFF""")  ' outer body
    sHTML = Replace(sHTML, "background:#F4F4EF", "background:#7D99B6")    ' top section
    sHTML = Replace(sHTML, "background:#F4F5F8", "background:#F4F5F8")    ' footer section (no-op placeholder; substitute your own color)
    
    ' Fix the red date color while we're here
    'sHTML = Replace(sHTML, "color:#D52B1E", "color:#393A3D")              ' match surrounding text
    
    oItem.htmlBody = sHTML
    
    'MsgBox "Background colors updated.", vbInformation
End Sub

Sub RemoveDayOfWeek(oRange As Object)
    Dim sDays As String
    Dim oDays() As String
    Dim i As Integer
    
    sDays = "on Mon, |on Tue, |on Wed, |on Thu, |on Fri, |on Sat, |on Sun, "
    oDays = Split(sDays, "|")
    
    For i = 0 To UBound(oDays)
        Call ReplaceText(oRange, oDays(i), "")
    Next i
End Sub

Sub UpdateToFirstOfNextMonth(oRange As Object)
    Dim oRegex As Object
    Dim oMatches As Object
    Dim oMatch As Object
    Dim sOldDate As String
    Dim sNewDate As String
    Dim dDate As Date
    Dim dNewDate As Date
    
    ' Use RegEx to find MM/DD/YYYY pattern
    Set oRegex = CreateObject("VBScript.RegExp")
    With oRegex
        .Global = True
        .Pattern = "\d{2}/\d{2}/\d{4}"
    End With
    
    Set oMatches = oRegex.Execute(oRange.Text)
    
    If oMatches.Count = 0 Then
        MsgBox "No date found.", vbExclamation
        Exit Sub
    End If
    
    For Each oMatch In oMatches
        sOldDate = oMatch.Value
        
        ' Parse the found date
        dDate = CDate(sOldDate)
        
        ' Calculate first day of next month
        dNewDate = DateSerial(Year(dDate), Month(dDate) + 1, 1)
        
        ' Format back to MM/DD/YYYY
        sNewDate = Format(dNewDate, "MM/DD/YYYY")
        
        ' Use safe Word find/replace
        Call ReplaceText(oRange, sOldDate, sNewDate)
    Next oMatch
End Sub

Sub ColorDateGreen(oRange As Object)
    Dim oPara As Object
    Dim oRegex As Object
    Dim oMatches As Object
    Dim oMatch As Object
    Dim oFindRange As Object
    
    Set oRegex = CreateObject("VBScript.RegExp")
    With oRegex
        .Global = True
        .Pattern = "\d{2}/\d{2}/\d{4}"
    End With
    
    For Each oPara In oRange.Paragraphs
        Set oMatches = oRegex.Execute(oPara.Range.Text)
        
        For Each oMatch In oMatches
            ' Create a range for just the date
            Set oFindRange = oPara.Range.Duplicate
            oFindRange.Start = oPara.Range.Start + oMatch.FirstIndex
            oFindRange.End = oFindRange.Start + oMatch.Length
            
            ' Recolor the date: RGB(0, 137, 46) is green (#00892E)
            oFindRange.Font.Color = RGB(0, 137, 46)
        Next oMatch
    Next oPara
End Sub

Sub LeftJustifyInvoicePara()
    Dim oInspector As Inspector
    Dim oDoc As Object
    Dim oCell As Object
    Dim oPara As Object
    
    Set oInspector = Application.ActiveInspector
    If oInspector Is Nothing Then Exit Sub
    
    Set oDoc = oInspector.WordEditor
    If oDoc.Tables.Count = 0 Then Exit Sub  ' no invoice table to work on
    
    Set oCell = oDoc.Tables(1).Cell(1, 1)
    
    For Each oPara In oCell.Range.Paragraphs
        ' Find the paragraph that contains the invoice message
        If InStr(oPara.Range.Text, "Please remit payment") > 0 Then
            oPara.Alignment = 0  ' wdAlignParagraphLeft
        End If
    Next oPara
End Sub

We also could have eliminated the need to click a button to reformat invoice emails by calling the FixQBEmail process from within the existing QuickBooks Outlook integration, intercepting outgoing Outlook email via the Application.ItemSend / MailItem.Send event before the message leaves Outlook.

Another alternative to this, if you need a greater degree of control, would be the following workflow:

  1. Batch-create invoices in QuickBooks.
  2. Send them to a mailbox you control.
  3. Use automation to extract the hosted invoice URL from each email.
  4. Generate your own cleaner outbound email using that URL.
  5. Send them out from your own domain/mailbox.
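Step 3 of that workflow is straightforward with a regular expression. The sketch below is a hypothetical Python example; the URL pattern is an assumption, so inspect a real QuickBooks email and adjust it to match the link Intuit actually embeds:

```python
import re

# The URL pattern below is an assumption for illustration; adjust it to
# match the hosted "View and pay" link in a real QuickBooks email.
INVOICE_LINK = re.compile(r"https://connect\.intuit\.com/[^\s\"'<>]+")

def extract_invoice_url(html_body):
    """Return the first hosted invoice URL found in the email HTML, or None."""
    match = INVOICE_LINK.search(html_body)
    return match.group(0) if match else None

sample = '<a href="https://connect.intuit.com/pay/invoice/abc123">View and pay</a>'
print(extract_invoice_url(sample))  # -> https://connect.intuit.com/pay/invoice/abc123
```

From there, steps 4 and 5 are just composing and sending your own email from your own mailbox with that URL behind your own payment button.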

I still believed the invoice emails we sent out before Intuit changed their default appearance were more professional-looking, but this code at least succeeded in taking something that looked utterly ridiculous and making it look less so.  I was willing to call it a win at the time. However, Intuit had other plans.


The latest QuickBooks change, which nearly restores the original email invoice format from five months ago, does little to reassure me that the invoice emails generated in the future will look similar. Instead, I am convinced that users like me have absolutely no persistent control over how these invoice emails will look when created directly through QuickBooks, and that is not acceptable. The updates to invoice emails are occurring without any permission or approval from QuickBooks users and do not appear to be connected in any way to software updates that users perform.

Invoice notification displaying the amount due of $1,000.00, addressed to Steve, with invoice number #34000 and a request for payment at earliest convenience.
Intuit QuickBooks’ Old Invoice Email

It may be tempting to believe that Intuit has finally resolved this snafu, but after the frustration of dealing with the issue repeatedly over a period of months I doubt many users will be able to trust Intuit further in this regard. Professional services firms need adequate controls on client-facing communications and the only way they will get those controls where Intuit’s invoice emails are concerned is by moving away from QuickBooks altogether or creating their own internal processes to compensate for Intuit’s shortcomings.


Kevin Shea Impact 2010

About the Author: Kevin Shea is the Founder and Principal Consultant of Quartare; Quartare provides a wide variety of agile technology solutions to investors and the financial services community at large.

For details, please visit Quartare.com, contact Kevin Shea via phone at 617-720-3400 x202 or e-mail at kshea@quartare.com.

I have been creating useful content for Advent users and the financial services firms they work with for many years now.  Part of creating the blog is therapeutic for me, part of it is a well-intentioned effort to foster goodwill by sharing the lessons I have learned with other users to reduce pain points, and some part of it is an effort to get the word out about what I do in case users find that they need someone with my unique skillset.

A watercolor painting of a pensive sculpture with a hand on its chin, surrounded by a colorful abstract background.

When ChatGPT and similar services arrived on the scene, I was somewhat concerned about what would happen if those engines sucked up the knowledge base I have created and presented it to my audience as their own information.  There are documented ways to discourage AI bots, like robots.txt, Cloudflare AI Crawl, WAF blocking and CMS-specific settings.  I know this, but early on I made a choice to allow it rather than fight it.

Initially, I was most worried that my knowledge would be presented without credit to where it came from, but this past week I realized that there is another issue altogether.  In the process of troubleshooting an Advent Software use case, I queried Perplexity.  I was rewarded with a page of summary information that cited Advent, AdventGuru and 13 other sources.  Some of those sources were relevant; most were not.

As I drilled down on the problem, I found that most of the sources cited were in fact me.  So here I was querying Perplexity for assistance, and it was attempting to assist me in troubleshooting the issue using information I provided.  Some part of this makes sense and could be helpful if I were losing my faculties or wanted to query my own digital footprint related to the issue.  Neither of these apply.

I wound up resolving the issue with the user and their IT consultant in less than an hour, with no part of the credit attributable to Perplexity and its “best” model. Our solution was collaborative.  The client arranged a Teams meeting with me and their IT consultant, and the three of us worked together to try a few things.  Eventually, we found a solution as a result of all of us working together, not an AI query.  We solved it because we found the time to have a meeting and made that a priority.

In the process of writing this, I ran the same query on Opus 4.6 and GPT-5.4.  The results were very similar. My blog and other online sources of current and historical information are the data that empowers these engines to respond to practical and esoteric questions with anything relevant beyond their training data.  However, as I read through their responses to the query, it became clear that I was heavily cited without any solution being provided to the problem.

While I am flattered that my subject matter expertise is held in such high esteem by AI inference engines, I am concerned that when AI models attempt to utilize what I have written, citing me repeatedly throughout their responses, and still fail to provide a solution, that failure reflects poorly on me.  In the particular case we resolved, I am almost 100% certain that a solution does not currently exist online.  I have already written a separate blog post detailing the problem and its solution, but now the question for me is: do I put that solution up on my blog?

By doing so, I continue to provide access to users who have relied on me as a source of information that may not otherwise be discovered, documented or publicly available. But I am also empowering AI inference to parrot my expertise in more meaningful ways that may make users think their favorite AI chatbot is a substitute for getting knowledge directly from the source. The latter is problematic because written works contain meaning and nuance that are lost when information is taken selectively and presented out of context.

Chatbots cannot be trusted to provide the best possible answer, only the best possible answer based on their training, parameters, capabilities, available data, and the prompts we use to query them.  My blog posts are representative breadcrumbs of the experience I have chosen to share.  In this case, the chatbots reviewed my blogs to determine if and how something could be done, asserted that it could not be done, and then provided instructions on possible workarounds.  But a solution could be found without the assistance of a chatbot all along, and it makes a lot more sense than the workarounds recommended by Perplexity and its cohorts.



About the Author: Kevin Shea is the Founder and Principal Consultant of Quartare; Quartare provides a wide variety of technology solutions to investment advisors nationwide.

For details, please visit Quartare.com, contact Kevin Shea via phone at 617-720-3400 x202

Over the years I have published many blog posts.  Almost none of them are as frequently visited as Getting Data In and Out of Advent APX and Axys. That is a good indicator that the topic remains relevant, but a long time has passed since it was published. If you have a recent version of APX today, there is another option that was not available back then: a RESTful API.  Though I knew the functionality existed within more recent versions of APX, I hadn’t had the chance to implement it with a client yet.

Abstract illustration of interconnected blue and orange lines resembling data flow or network connections on a dark background.

Last year, an APX user approached me, seeking to utilize Advent Software’s API to create data pipelines between APX and their in-house MS SQL Server Data Warehouse (DW).  The APX user identified this work as a prerequisite to fully migrating from Axys to APX.  They wanted to maintain in APX the high-level integration they had created between their DW and Axys.

I was somewhat concerned because I had researched the API in previous APX versions and knew of some of the issues early adopters had encountered. Given that information, I had some trepidation about obligating myself to a project that required implementing the API. At the time, I was not convinced, via first-hand experience, that the API would work reliably for their planned application of it.

I tempered the prospective client’s expectations and proposed a flat-rate job focused on determining the feasibility of doing APX API integration in their environment. Our end goal was to develop a couple of data pipelines between the DW and APX, with the caveat that those deliverables, developed in Python and/or JavaScript, would be proof-of-concept work and not necessarily ready for production use.

Together we successfully completed the project in an APX v21.x environment self-hosted by the client. The majority of the work was done over the course of a couple of months and the deliverables were ready to be moved into production almost immediately afterwards, but there were some challenges along the way.  In most of the instances detailed below, we looped Advent in for assistance, and they did a commendable job helping us resolve the issues promptly.

  • Error 403 – Initially, we were getting an error when attempting to use the API.  We reached out to Advent, and they noted that the most recent Cumulative Hot Fixes (CHF) update wasn’t applied and recommended that we install it.  Applying the CHF update resolved the error, and the API worked as expected.

  • Postman functionality – There were a couple of days where Postman was completely unresponsive.  During that brief period, we had difficulty doing even the most basic API testing.  This issue seemed to resolve itself, but we may also have logged out of Postman and logged back in.

  • Error 500 writing data to APX – During development, the functionality to read APX data was working very well, but we found that attempting to write data to APX generated an Internal Server Error.  I assumed this meant that the data was not being written to APX.  After looping Advent in for another call, we discovered that although the error was being generated, the data was being successfully written to APX. Advent indicated they would put in a fix request, but it might not happen because v21.x was sunset. With some reservations, I updated my code to ignore error 500 when we wrote selective data to APX via the API.
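The workaround from that last bullet can be expressed as a small status-check helper. This is my own illustrative Python sketch, not Advent’s code; the idea is simply to treat the known-spurious 500 as a tolerated outcome on writes, in one place, rather than scattering blanket error suppression through the pipeline:

```python
def write_succeeded(status_code, tolerate_500=True):
    """Treat 2xx as success; optionally tolerate the known-spurious 500 on writes.

    tolerate_500 exists so the workaround can be switched off the day the
    underlying bug is fixed (or if you move to a patched APX version).
    """
    if 200 <= status_code < 300:
        return True
    return tolerate_500 and status_code == 500
```

A 403 or any other status still surfaces as a failure, so genuine permission and configuration problems are not masked by the workaround.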

If you do reach out to Advent for assistance, make sure you have Postman installed.  Advent has no desire to review your code.  They will want to test the functionality of the API with you using Postman.

Screenshot of a code editor displaying a Python script for updating data via an API. The interface includes files and folders related to API components, test logs, and a main script for API interaction.
Visual Studio screenshot of Python code sample illustrating API use.

What is Required to Get Started with the API?

Utilizing the API requires some detailed setup and work to get up to speed.  It probably won’t be something that just works without some troubleshooting, and there is a bit of a learning curve.  The following list may not cover everything you need to do to get up and running with the API, but it is a good place to start.  I wish there had been a better resource for me when I started working with APX’s REST API.

Here are some tips that should help those interested in implementing the API:

  1. Make sure you are on the latest CHF for your current version of APX.  If the latest hot fixes have not been installed, you may have problems trying to utilize the API.
  2. Download the Advent Portfolio Exchange REST APIs Postman Collection from the Advent Community website.
  3. Create a Postman account if you don’t already have one, and locally install the Postman software.
  4. Load the collection into your Postman profile and review the documentation completely.
  5. Do a search on the API in the Advent community site and read through some of the threads.  The code samples there were simple, but helpful.
  6. Create the client/credential and verify its existence via SSMS.  The client is persistent, so once you have created it, you shouldn’t have to create it again unless you update APX.  Verify the existence of the client (e.g., cc.postman) in the APX dbo.clients table.  If you have trouble creating the client using your code, try using the PowerShell script to create the client.
  7. The user profile you are using needs to have appropriate rights.  Though we escalated my individual user rights in all the documented required areas, I eventually started using the admin user profile, which worked more reliably in our environment.  I believe Advent recommends using the admin user profile if possible.
  8. Test basic APX API functionality in Postman to make sure it works before attempting to create code via C#, Python, JavaScript, et cetera that leverages the API.

Once you have completed the setup required and can use the API to read and write data to APX, you are ready to build out your solution.  If you have trouble with your implementation, validate the specific functionality of the API with Postman.

Calling the API

Almost any use case of the API to read or write APX data requires the following steps:

  1. Get IdentityServer base address from APX authentication configuration.
  2. Get token endpoint from IdentityServer configuration.
  3. Get token with client_credentials grant type.
  4. Perform whatever API action you want (multiple calls to the API with the access_token are fine).
  5. End your API Session.  The API utilizes one of your APX seats while the session is active.
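As a rough Python sketch of steps 1 through 3, the token request is a standard client_credentials form POST. The base address and client values below are placeholders; the real IdentityServer address and token endpoint come from your APX configuration, and the “/connect/token” path is a common IdentityServer default rather than something Advent-specific I can vouch for:

```python
import json
from urllib import parse, request

def build_token_request(identity_base, client_id, client_secret):
    """Return (token_endpoint, form_payload) for a client_credentials grant."""
    token_endpoint = identity_base.rstrip("/") + "/connect/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    return token_endpoint, payload

def fetch_token(identity_base, client_id, client_secret):
    """POST the form and return the access_token (network call; not run here)."""
    endpoint, payload = build_token_request(identity_base, client_id, client_secret)
    data = parse.urlencode(payload).encode()
    with request.urlopen(request.Request(endpoint, data=data)) as resp:
        return json.load(resp)["access_token"]

# Placeholder values for illustration only
endpoint, payload = build_token_request(
    "https://apx.example.com/identity", "cc.postman", "not-a-real-secret")
```

The access_token returned then goes into the Authorization header of subsequent calls (step 4), and remember to end the session (step 5) so you free up the APX seat.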

Those familiar with API use and Python are likely aware that manipulating data can necessitate working with JSON as well as Python dictionaries.  As an example, in order to read data from APX and write data from the DW into APX that differs from what is already in APX, you may need to:

  1. Query APX for the relevant data via the API, which creates a JSON file.
  2. Query the DW for the relevant data.
  3. Load the JSON data received from APX into a Python dictionary.
  4. Parse and compare the APX data from the Python dictionary with the records from the DW.
  5. Add the records that meet the criteria to the JSON payload.
  6. Send a patch request via the APX API.

The following diagram details this workflow.

Flowchart illustrating the data pipeline between Advent APX and a Data Warehouse using REST API, detailing various processing steps and data storage interactions.
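A toy Python illustration of steps 3 through 5 follows, with made-up field names (“portfolioCode”, “value”) standing in for whatever key and data fields your pipeline actually compares:

```python
import json

def build_patch_payload(apx_json, dw_records):
    """Index APX records by key, keep only DW records that are new or changed."""
    apx_by_code = {rec["portfolioCode"]: rec for rec in json.loads(apx_json)}
    payload = []
    for rec in dw_records:
        current = apx_by_code.get(rec["portfolioCode"])
        if current is None or current["value"] != rec["value"]:
            payload.append(rec)  # new or changed record -> include in PATCH body
    return payload

# Sample data in place of the real API response and DW query results
apx = json.dumps([{"portfolioCode": "A1", "value": 100},
                  {"portfolioCode": "B2", "value": 200}])
dw = [{"portfolioCode": "A1", "value": 100},  # unchanged -> skipped
      {"portfolioCode": "B2", "value": 250},  # changed   -> patched
      {"portfolioCode": "C3", "value": 300}]  # new       -> patched
patch = build_patch_payload(apx, dw)  # records for B2 and C3 only
```

The resulting list becomes the body of the PATCH request in step 6, which keeps the write traffic limited to records that actually changed.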

To wrap up the project, I created a PowerPoint presentation summarizing and detailing what we did and how it all works to empower the internal development team to understand, troubleshoot, and replicate my work if they need to in the future. I am always available to support the solutions I create, but I prefer that my customers call me because they want to, not because they need to.

Why would you want to use the API instead of IMEX?

There are pros and cons to using the API. It presents an opportunity to use a single unified methodology to integrate data in your environment but may fall short of that depending on the specific needs of your firm.

The pros of using the API include the fact that it is a more modern approach to extracting and importing data at a granular level. The API can be used to pull data such as holdings, select time-period performance, etc. In some use cases, APX users are likely extracting and transforming data that they drop into a DW. Some of those transformations, such as recalculating performance figures, may not be necessary when utilizing the API. The API has the potential to be more secure, but given that the default password for the admin user in APX frequently doesn’t get changed, it probably isn’t any more secure than IMEX in most self-hosted APX environments.

The cons of using the API are that some data elements may still be in flux. Reading and writing certain data points may not be possible via the API, which could force you to use IMEX or other methods (e.g., Replang, public views, stored procedures, SSRS) in addition to the API. It also may be difficult for developers who aren’t Advent APX Subject Matter Experts (SMEs) to bridge this gap. Conversely, it may be difficult for SMEs who are not developers familiar with API use to implement it on their own.

Using well-established APX import and export methods like IMEX may still be the most efficient and reliable way to import and extract certain data elements from APX. However, going forward, the growing maturity of Advent’s REST API should force tech-savvy management, users, and integrators to ask “Should we be using the APX API to do this?” as they look to forge a modern data stack that integrates APX data and meets AI-driven demands for more robust data access.



About the Author: Kevin Shea is the Founder and Principal Consultant of Quartare; Quartare provides a wide variety of agile technology solutions to investors and the financial services community at large.

To learn more, please visit Quartare.com, contact Kevin Shea via phone at 617-720-3400 x202 or e-mail at kshea@quartare.com.

Image created using AI query for Python code to create word art with Replang keywords.

The State of Reporting Development for Axys and APX Users

Advent users continue to benefit from many different report development options. There is a tantalizing and sometimes dizzying array of reporting options both within Advent’s architecture and provided by third-party solution providers, products and platforms.  In most cases, leveraging the most enticing options takes a commitment of time, money and patience.

At the top, management may envision staff using a single transformative technology that unifies all the data and makes it easier to push, pull or outright access data from portfolio accounting and ancillary systems. However, the truth, at least where Advent is concerned, is that the most effective way of making all those wonderful connections between applications and other data sources is a blended approach using the most effective method for various data elements.  A cohesive strategy and well-organized approach to data gathering and sharing should be implemented, but it is not critical or realistic that all data elements be delivered via one approach or method.

APX users have the ability to tap data from APX’s underlying SQL Server database using a growing combination of data integration options within the framework of APX.  These options include Stored Accounting Functions, Public Views, SSRS and REST API – as well as any other reporting tools and systems that can make use of that infrastructure.  APX users have a lot of capabilities baked into the platform that Axys users don’t have, but from what I typically see out in the wild, most firms using APX aren’t leveraging those features as well as they could.

Evolving Report Development Options for Axys and APX Users

Axys, APX and other portfolio accounting system users, who have taken the time to use ETL tools, like xPort, to populate their own data warehouses, will have similar data schemas focused on the most critical data (e.g., clients, agreements, revenue, portfolios, transactions, performance, holdings, etc.) to their respective businesses.  Depending on firm size and budget constraints, these users may benefit from tapping that data with a visual analytics platform like Pyramid Analytics, Microsoft Fabric or Tableau.

I am excited about the latest emerging tech and currently working with what I see as some of the best platforms and tech available.  Newer tech isn’t going away, but for someone with their feet firmly planted on the ground who needs to generate a relatively simple report today, it probably makes sense to hit the snooze bar momentarily and attempt to do what needs to be done now.  Though it may appear outdated by comparison, Axys and APX users can also create reports using Report Writer Pro or via updates to Replang source code directly.

While advanced reporting tools can be extremely powerful and, in fact, instrumental for some types of reporting requirements, I am a fan of Occam and his razor. In many cases, there is just no need to complicate reporting any more than is useful to accomplish the end goal. Replang, which was established in Advent Software’s infancy, is still very much part of the reporting architecture of Axys and APX and will likely remain part of it forever.

Like many Advent users out there, I have used Notepad and/or Notepad++ to modify Advent Axys, APX and Report Writer Pro reports. I was modifying these files via the MS-DOS Edit command way back when they were part of The Professional Portfolio. Any of these tools is sufficient, but plain old Notepad and Edit don’t even display line numbers. Notepad++ is a step in the right direction, as it provides line numbers and the ability to use plug-ins, but neither option could be considered a modern tool for source code modifications.

Visual Studio Code

That’s where Visual Studio Code (VSCode) comes in. VSCode, which is perhaps one of the most popular and versatile utilities for source code updates, offers support for many of today’s most popular languages and a few of the older ones as well. When I first started using VSCode, I did a quick search for a Replang extension. Unfortunately, Replang wasn’t one of the supported programming languages, but VSCode does allow developers to build extensions, which are similar to plug-ins in Notepad++.

Prior to creating the extension, I also tried a number of the available supported languages in VSCode to see if anything came close. Some of the best candidates helped a little, but I was disappointed with the results. Out of the gate, VSCode provides line numbering and many other useful features. Frankly, the only reason to ever use Notepad again is because it is always there and it is simple to use.

In order to provide language support for Replang in VSCode, I needed to create an extension with knowledge of Replang’s keywords. Replang for Axys has roughly a hundred keywords, and the most current versions of APX add another hundred-plus keywords. Building a truly robust extension for Replang would mean spending more time than I put into it on the day I created it. Ideally, you could provide keyword-specific information with examples that would appear when you hover over a keyword. Eventually, I may build that into the extension, but the most critical feature in my mind is to provide contrast between keywords, comments and dialog to highlight the syntax and make it easier to read.
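For context, VSCode syntax highlighting of this kind is driven by a TextMate grammar bundled with the extension. A minimal sketch of what such a grammar might look like is below; the scope names follow standard VSCode/TextMate conventions, while the comment pattern and the handful of keywords shown are placeholders for illustration, not Replang’s actual syntax or keyword list:

```json
{
  "scopeName": "source.replang",
  "patterns": [
    { "name": "comment.line.replang", "match": "//.*$" },
    { "name": "keyword.other.replang", "match": "\\b(print|format|sort|heading)\\b" },
    { "name": "string.quoted.double.replang", "match": "\"[^\"]*\"" }
  ]
}
```

Each pattern maps a regular expression to a scope name, and the active VSCode color theme then decides how those scopes are rendered, which is what produces the contrast between keywords, comments and dialog.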


Example: Modifying Replang code with Visual Studio Code using the Replanguist extension.

If you routinely modify Advent Reports and are looking for an improved tool to do so, you may want to check out the Replanguist extension I built and published to facilitate Replang edits. You should be able to find it in the list of available VSCode extensions from Microsoft.

As always, if you have questions or suggestions, please feel free to reach out and connect with me.



About the Author: Kevin Shea is the Founder and Principal Consultant of Quartare; Quartare provides a wide variety of technology solutions to investment advisors nationwide.

For details, please visit Quartare.com, contact Kevin Shea via phone at 617-720-3400 x202 or e-mail at kshea@quartare.com.