Retrieving Qonto's transactions via the API and converting them into a CSV format that matches our accounting tool

In this article, I show you how to use Qonto's API to retrieve transactions and export them to CSV.
2020.07.07

Congratulations to Classmethod on its 16th anniversary! As today is the day Classmethod was founded in Japan, we celebrate it by writing blog posts and sharing our knowledge on our prestigious tech blog, Developers.IO :)

Today I am going to write about the French neobank "Qonto" and how to make use of its API.

Classmethod Europe has been using Qonto as one of our main banks since last year. While Qonto was launched in France, the service is currently also available in Spain, Germany, and Italy. Qonto has a user-friendly interface and a lot of innovative features. If you live in the EU, you may have heard good things about N26, which is also well known for neobanking with flexible options. I'd say Qonto is just as user-friendly as N26, and moreover it is designed for businesses.

For each transaction, you can choose a category and VAT rate, add a note and custom labels, and even attach related files. A variety of filters is available to search through them.

One of Qonto's most attractive features is its API. You can retrieve transactions or fetch attachments via the API -- without opening a browser, logging into the online bank, and going to the History page. Why waste time not taking advantage of this great opportunity?

One thing to consider when you start using Qonto is that, since it is not a traditional bank, your accounting tool may not support connecting to it and fetching transactions automatically. We use Lexoffice, and unfortunately it doesn't support this feature at the moment, which means we need the transaction data in a formatted CSV so that we can import it into Lexoffice manually.

I'll show you how to fetch Qonto's transaction data using the API, reshape it to match Lexoffice's format, and save it as a CSV using Python.

Get started with the API

First things first. Every API request needs authentication, which means including your API login and secret key. To find these, sign into Qonto online, click your company name, choose Settings, and then click Integrations (API) in the menu.

To fetch transactions, you also need the "slug" and "IBAN". On the web app (GUI), you can find the IBAN as part of your bank information on the Details and Statements page. In our case, the "slug" is the same as the API login.

According to the official documentation, you can get this information by executing the Show Organization request. Again, the "slug" is the same as the API login for us, but it seems to differ if you have multiple bank accounts under one organization. Here's an example using curl; the same request can also be made in various languages including Go, Java, and Python (a Python version is shown after the update note below).

% login="classmethod-***"
% secret_key="***"
% curl --request GET \
  --url https://thirdparty.qonto.com/v2/organizations/%7Bslug%7D \
  --header "authorization: ${login}:${secret_key}" \
  --data '{}' | jq .
{
  "organization": {
    "slug": "classmethod-***",
    "bank_accounts": [
      {
        "slug": "classmethod-***-bank-account-1",
        "iban": "FR1234567890123456789012345",
        "bic": "QONTOXXXXXX",
        "currency": "EUR",
        "balance": xxx,
        "balance_cents": xxx,
        "authorized_balance": xxx,
        "authorized_balance_cents": xxx
      }
    ]
  }
}
[Updated on 08.26.2020] As of August 18th, Qonto has changed its domain from .eu to .com. There also seem to have been some updates to the API: HTTP no longer works and HTTPS is now required. The domain and protocol have been updated in all the code in this post. I also made the following additional change in the code above to follow the latest documentation.

  • %7Bid%7D --> %7Bslug%7D
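
For reference, here is what the same Show Organization request looks like in Python, using the same http.client pattern as the scripts later in this post. This is only a minimal sketch; the login/slug and secret key are placeholders.

# -*- coding:utf-8 -*-
import http.client
import json

login = "classmethod-***"   # API login; in our case this is also the organization slug
secret_key = "***"

conn = http.client.HTTPSConnection("thirdparty.qonto.com")
headers = { 'authorization': "{}:{}".format(login, secret_key) }

# The slug is part of the URL path; for us it is identical to the API login
conn.request("GET", "/v2/organizations/{}".format(login), "{}", headers)

res = conn.getresponse()
print(json.dumps(json.loads(res.read().decode("utf-8")), indent=3))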

List Transactions

By using the List Transactions request, you can get the details of the transactions made in your account. Some information, such as reversed transactions, cannot even be viewed on the GUI and is available only via API requests. The following filters are available:

  • Status: pending / reversed / declined / completed (default)
    • 'pending' appears as 'processing' and 'completed' as 'settled' on the GUI. 'declined' can be fetched only via the API.
    • e.g. &status[]=completed or &status[]=completed&status[]=pending for multiple choices.
  • Start and end date: updated_at / settled_at
    • 'settled_at' is the time when the transaction was settled. It is applied only to completed transactions.
    • 'updated_at' is the time when transactions were last updated. The updates include the status change from pending to settled, as well as changes made by users, such as adding a note, changing the category, or choosing a VAT rate on GUI.
    • You append '_from' and/or '_to' to 'settled_at' or 'updated_at' and specify a date/time in a valid format (ISO 8601).
    • e.g. &settled_at_from=2020-06-01, &updated_at_from=2020-01-10T01:00:00.000Z&updated_at_to=2020-01-20T01:00:00.000Z
  • Sorting: asc / desc (default)
    • You can sort the fetched transactions by either the settled date or the updated date. By default, the settled date is used.
    • e.g. &sort_by=asc (same as &sort_by=settled_at:asc), &sort_by=updated_at (same as &sort_by=updated_at:desc)

For example, with the following Python script, you can fetch the records of the currently pending transactions. *Don't forget to put [] after "status"! Otherwise, the status filter will simply be ignored.

ls_transactions.py

# -*- coding:utf-8 -*-
import http.client
import json

login = "classmethod-***"
secret_key = "***"

conn = http.client.HTTPSConnection("thirdparty.qonto.com")
payload = "{}"
headers = { 'authorization': "{}:{}".format(login,secret_key) }

conn.request("GET", "/v2/transactions?sort_by=updated_at:asc&status[]=pending&iban=FR1234567890123456789012345&slug=classmethod-***", payload, headers)

res = conn.getresponse()
data = res.read()

output_json = json.loads(data.decode("utf-8"))
output = json.dumps(output_json, indent=3)
print(output)

Here is an example response:

{
   "transactions": [
      {
         "transaction_id": "classmethod-***-215",
         "amount": 0.89,
         "amount_cents": 89,
         "attachment_ids": [],
         "local_amount": 1.0,
         "local_amount_cents": 100,
         "side": "debit",
         "operation_type": "card",
         "currency": "EUR",
         "local_currency": "USD",
         "label": "AWS EMEA aws.amazon.co LU",
         "settled_at": null,
         "emitted_at": "2020-06-30T07:26:00.000Z",
         "updated_at": "2020-06-30T07:26:05.658Z",
         "status": "pending",
         "note": null,
         "reference": null,
         "vat_amount": null,
         "vat_amount_cents": null,
         "vat_rate": null,
         "initiator_id": "69f5ab13-82c6-4478-b35b-***",
         "label_ids": [],
         "attachment_lost": false,
         "attachment_required": true
      }
   ],
   "meta": {
      "current_page": 1,
      "next_page": null,
      "prev_page": null,
      "total_pages": 1,
      "total_count": 1,
      "per_page": 100
   }
}

As seen in the result, three different dates are included in a transaction. In this case, 'settled_at' is null because the transaction is pending and not yet settled. 'settled_at' and 'updated_at' have the same meanings as described for the filters above. 'emitted_at' is the time when the transaction was first recorded.

You can apply the same filters on the GUI, see the same results, and export them to CSV. However, the dates are displayed differently. As far as I could tell, the date values map as shown in the table below; a small conversion sketch follows the table. The dates in the exported CSV use the format DD-MM-YYYY HH:mm:ss, whereas the API returns them as YYYY-MM-DDTHH:mm:ss.sssZ in the UTC timezone.

CSV exported on GUI   | GUI                                                          | API response
settlement_date_utc   | -                                                            | settled_at
settlement_date_local | Used as the filter date if the transaction is completed      | -
-                     | -                                                            | updated_at
value_date_utc        | -                                                            | emitted_at
value_date_local      | Displayed as 'Payment date' in the transaction list; used as the filter date if the transaction is pending | -
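
As a quick illustration of the difference in formats, here is a small sketch (not part of the final script) that reformats an API timestamp into the CSV's DD-MM-YYYY HH:mm:ss style:

# -*- coding:utf-8 -*-
from datetime import datetime

# The API returns e.g. "2020-06-30T07:26:05.658Z" (UTC); the GUI's CSV uses DD-MM-YYYY HH:mm:ss
api_ts = "2020-06-30T07:26:05.658Z"
dt = datetime.strptime(api_ts, '%Y-%m-%dT%H:%M:%S.%fZ')
print(dt.strftime('%d-%m-%Y %H:%M:%S'))   # 30-06-2020 07:26:05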

Also, 'amount' in an API response is always positive. Instead of using a +/- sign, each record has a 'side' field: 'credit' stands for an incoming transaction and 'debit' for an outgoing one. In the example above, the pending transaction is an expense.
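
A minimal sketch of applying the sign, assuming outgoing (debit) transactions should appear as negative amounts in the accounting CSV:

# Values taken from the example response above
side, amount = "debit", 0.89
signed = -amount if side == "debit" else amount
print(str(signed))   # -0.89 (an outgoing transaction, i.e. an expense)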

List Memberships

In the previous example, 'initiator_id' contains an alphanumeric ID like "69f5ab13-82c6-4478-b35b-***". This means that one of the Qonto users made the transfer or used their card, but you cannot tell who from those characters alone. You need to combine this with the results of List Memberships in order to find out who it was.

ls_memberships.py

# -*- coding:utf-8 -*-
import http.client
import json

login = "classmethod-***"
secret_key = "***"
conn = http.client.HTTPSConnection("thirdparty.qonto.com")
payload = "{}"
headers = { 'authorization': "{}:{}".format(login,secret_key) }

conn.request("GET", "/v2/memberships", payload, headers)

res = conn.getresponse()
data = res.read()

output_json = json.loads(data.decode("utf-8"))
output = json.dumps(output_json, indent=3)
print(output)

Here's the sample response.

{
   "memberships": [
      {
         "id": "69f5ab13-82c6-4478-b35b-***",
         "first_name": "Ma",
         "last_name": "Ito"
      },
      ...
   ],
   "meta": {
      "current_page": 1,
      "next_page": null,
      "prev_page": null,
      "total_pages": 1,
      "total_count": 5,
      "per_page": 100
   }
}
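
Combining the two responses, resolving an 'initiator_id' from a transaction into a person's name is a simple dictionary lookup. A minimal sketch, assuming 'data_mem' and 'data_trn' hold the parsed JSON from the two requests above (those variable names are just for illustration):

# Build an id -> full name mapping from the memberships response
members = {m['id']: m['first_name'] + " " + m['last_name'] for m in data_mem['memberships']}

# Resolve the initiator of each transaction (if any)
for trn in data_trn['transactions']:
    initiator = members.get(trn['initiator_id'], "unknown")
    print("{} {} EUR by {}".format(trn['side'], trn['amount'], initiator))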

Retrieve transactions into CSV

Finally, by executing the following script locally, I fetch the settled transactions for a specified period, review the pending transactions in that period, and export the settled records to a CSV that is compatible with Lexoffice's format. This is what it looks like.

  • Timezone

When the script is executed, it prompts for a start date and an end date. As you can see in the request parameters in the image above, the specified period runs from 22:00:00 to 21:59:59 in UTC. This corresponds to 0:00:00 to 23:59:59 in our local timezone, CET/CEST (UTC+1, or UTC+2 during summer time). I wanted the data in our local timezone, and also to retrieve it the same way the GUI does, which filters dates in the local timezone, so the script builds the start/end times in local time. To cope with the complication of summer time, I use the "pytz" module to convert between timezones in the script.
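
As a small illustration of that conversion (a sketch of what conv_date in the script below does; note that pytz is a third-party module, so you may need to install it with pip first):

# -*- coding:utf-8 -*-
from datetime import datetime
import pytz

local_tz = pytz.timezone('Europe/Berlin')
utc_tz = pytz.timezone('UTC')

# Local midnight on 2020-07-01 (CEST, UTC+2 in summer) ...
start_local = local_tz.localize(datetime(2020, 7, 1, 0, 0, 0))
# ... becomes 22:00 UTC on the previous day
print(utc_tz.normalize(start_local).strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
# 2020-06-30T22:00:00.000000Z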

Below is the whole script.

get_csv.py

# -*- coding:utf-8 -*-
import http.client, sys
from datetime import datetime, timedelta
import json, csv, pytz
import urllib.parse

# Credentials
login = "classmethod-***"
seckey = "***"
iban = "FR1234567890123456789012345"

# Define timezones
utc_tz = pytz.timezone('UTC')
local_tz = pytz.timezone('Europe/Berlin')

TRN, MEM = "transactions", "memberships"

class Color:
	RED       = '\033[91m'
	BOLD      = '\033[1m'
	UNDERLINE = '\033[4m'
	END       = '\033[0m'

def get_date(x):
	"""
	Prompt for a start or end date and validate the input format.
	
	Args:
	    x (str): "start" or "end"
	Returns:
	    date: YYYYMMDD
	"""
	date_q = x.capitalize()
	date = input(Color.BOLD + f" ? {date_q} date [YYYYMMDD]: " + Color.END)

	# Validate the input value
	if date.isdigit() and len(date) == 8:
		return date
	else:
		# Retry the input prompt
		print(Color.RED + "\nYou entered a wrong value. Please try again." + Color.END)
		return get_date(x)


def conv_date(s, e):
	"""
	Convert the request filter's start/end date to the date with time, 
	0:00:00 for start and 23:59:59 for end. Then convert it from local timezone to UTC and encode it.

	Args:
		s (str): start_date (YYYYMMDD)
		e (str): end_date (YYYYMMDD)
	Returns:
		dates: list of the encoded start/end datetimes in UTC
	"""
	sdt = datetime.strptime(s, '%Y%m%d').replace(hour=0,minute=0,second=0,microsecond=0)
	edt = datetime.strptime(e, '%Y%m%d').replace(hour=23,minute=59,second=59,microsecond=999999)
	dates = []
	for dt in sdt, edt:
		dt_local = local_tz.localize(dt)
		date = utc_tz.normalize(dt_local).strftime('%Y-%m-%dT%H:%M:%S.%fZ')
		endate = urllib.parse.quote(date)
		dates.append(endate)
	return dates


def filter(dates, status):
	"""
	Return filter parameters with a status, the specified dates and a date filter option,
	which is decided based on the status.

	Args:
		dates (list): encoded start/end datetime in UTC
		status (str): "update" for declined/pending, or "settle" for completed
	Returns:
		filters: The filter of status and period (&status[]=...)
	"""
	if status == "update":
		status_fil = "&status[]=declined&status[]=pending"
		date_pre = "&updated_at_"
	elif status == "settle":
		status_fil = "&status[]=completed"
		date_pre = "&settled_at_"
	sdate_fil = date_pre + "from=" + dates[0]
	edate_fil = date_pre + "to=" + dates[1]
	filters = status_fil + sdate_fil + edate_fil
	return filters


def log_updates(data):
	"""Display any pending/declined (not yet settled) transactions found in the period."""
	num = data['meta']['total_count']
	print(Color.BOLD + "There are {} non-settled records.".format(num) + Color.END)
	for trn in data["transactions"]:
		status = trn['status']        # status
		date_utc = trn['updated_at']  # last updated date
		amt = trn['amount']           # amount
		side = trn['side']            # credit or debit 
		lab = trn['label']            # counterpart
		ref = trn['reference']        # reference
		# Convert date and amount
		date = conv_utc(date_utc)
		amount = conv_amount(side, amt)
		# Log the results
		print("{} (Last update: {})".format(status, date))
		print("Counterpart : " + lab)
		print("Amount      : " + amount + " EUR")
		if ref:
			print("Reference   : " + ref)


def get_completes(data_mem, data_set):
	"""Combine memberships and the completed transactions into rows for the Lexoffice-style CSV."""
	# Get the number of completed transactions
	num = data_set["meta"]["total_count"]
	print(Color.BOLD + "{} transactions found.".format(num) + Color.END)
	if num == 0: return

	# Create a dictionary of members
	members = {}
	for mem in data_mem['memberships']:
		member_id = mem['id']
		fname = mem['first_name']
		lname = mem['last_name']
		fullname = fname + " " + lname
		members.update({member_id: fullname})

	# Set an array to insert values for CSV
	trns = []	
	for trn in data_set["transactions"]:
		amt = trn['amount']                # Betrag
		lamt = trn['local_amount']         # Zusatzinfo (foreign amount)
		side = trn['side']                 # Betrag (credit/debit)
		op_type = trn['operation_type']    # Zusatzinfo (card/transfer/qonto_fee etc)
		lcur = trn['local_currency']       # Zusatzinfo (foreign currency)
		lab = trn['label']                 # Auftraggeber&Empfaenger
		book_date_utc = trn['settled_at']  # Buchungsdatum
		# value_date_utc = trn['emitted_at'] # Wertstellungsdatum
		note = trn['note']                 # [opt]Zusatzinfo (user's note)
		ref = trn['reference']             # [opt]Verwendungszweck
		user_id = trn['initiator_id']      # [opt]Zusatzinfo (user id)
		# Convert date and amount
		book_date = conv_utc(book_date_utc)
		# value_date = conv_utc(value_date_utc)
		amount = conv_amount(side, amt)

		## Reference
		if not ref:
			ref = op_type
			if user_id:
				user_name = members[user_id]
				ref = ref + " " + user_name
		## Additional info (foreign amount and/or user's note; "_" is inserted as a placeholder so the field is never empty)
		l = []
		if lcur != "EUR":
			lamount = conv_amount(side, lamt)
			local = lamount + " " + lcur
			l.append(local)
		if note: l.append(note)
		addinfo = ' '.join(l)
		if not addinfo: addinfo = "_"

		trns.append([book_date, lab, ref, amount, addinfo])
	return trns


def get_qonto(typ, filters):
	"""
	Get Qonto records with the specified parameters and (if applicable) filters.

	Args:
		typ (str): TRN or MEM
		filters (str): filters of a request; applicable only for TRN and consisted by status and dates
	Returns:
		output: json output of the fetched data
	"""
	payload = "{}"
	headers = { 'authorization': login + ":" + seckey}
	param = typ  # for transactions, addtl filters in the parameter
	if typ == TRN: 
		param = param + "?iban=" + iban + "&slug=" + login + filters
		print("\nRequesting to Qonto with the following parameters: ")
		print("{}\n".format(param))

	try:
		conn = http.client.HTTPSConnection("thirdparty.qonto.com")
		conn.request("GET", f"/v2/{param}", payload, headers)
	except Exception as e:
		print(e)
		print(Color.RED + "Error occurred while requesting {} to Qonto. Filters: {}".format(typ, filters) + Color.END)
		sys.exit()

	res = conn.getresponse()
	data = res.read()
	output = json.loads(data.decode("utf-8"))
	return output


def conv_utc(date):
	"""
	Convert the UTC timestamp in fetched records to the timestamp in the local timezone.

	Args:
	    date (str): UTC timestamp (YYYY-MM-DDTHH:mm:ss.sssZ)
	Returns:
	    date_local: Date converted to local TZ (YYYY-MM-DD HH:mm:ss)
	"""
	dt_utc = utc_tz.localize(datetime.strptime(date, '%Y-%m-%dT%H:%M:%S.%fZ'))
	dt_local = local_tz.normalize(dt_utc)
	date_local = dt_local.strftime('%Y-%m-%d %H:%M:%S')
	return date_local


def conv_amount(side, amount):
	"""
	Convert the always-positive amount to a signed amount based on the side.

	Args:
		side: "credit" or "debit"
		amount (float): positive amount
	
	Returns:
		amount (str): signed amount (outgoing 'debit' transactions become negative)
	"""
	if (side == "debit"): amount = amount * -1
	return str(amount)


# Specify the period of filters for a Qonto request
start_date = get_date("start")
end_date = get_date("end")
dates = conv_date(start_date, end_date)

# Get Pending/Declined transactions data and if exists, display them
filters = filter(dates, "update")
data = get_qonto(TRN, filters)
log_updates(data)
# Get member data and Completed transactions
data_mem = get_qonto(MEM, filters)
filters = filter(dates, "settle")
data_set = get_qonto(TRN, filters)
transactions = get_completes(data_mem, data_set)

if transactions:
	try:
		file_path = "qonto_" + start_date + "-" + end_date + ".csv"
		with open(file_path, 'w', newline='') as csvFile:
			csvwriter = csv.writer(csvFile, delimiter=',',quotechar='"', quoting=csv.QUOTE_NONNUMERIC)
			csvwriter.writerow(['Buchungsdatum', 'Auftraggeber/Empfänger', 'Verwendungszweck', 'Betrag', 'Zusatzinfo'])
			for record in transactions:
				csvwriter.writerow(record)
	except Exception as error:
		print(error)
	else:
		print(Color.BOLD + "The data has been successfully exported to " + file_path + Color.END)

Hopefully this helps you get to know Qonto's awesome service! This would be even more powerful if I implemented it on AWS and had the fetched and formatted CSV generated automatically on a weekly basis. I'll write another blog post once I've polished that up. Stay tuned!