Tag Archives: automated betting

June ’17

The big event of June was the release of a beta streaming version of Gruss. As it's in development, the Gruss guys requested feedback and users highlighted a number of bugs. We're now on the sixth release, so it's good to know the program is being worked on and improved. The bug I found related to moving on from suspended markets, but after I fed back via the forum a fix was quickly released. It took a little time for me to grasp the full effects of streaming. At first I thought the refresh was poor, as updates seemed very random. In fact the stream only updates when there is something to update, i.e. market activity, and doesn't waste bandwidth by refreshing the same data repeatedly as the old polling did. This gives a refresh chart that can have quite large gaps between updates, especially on markets with some time to go before the off. With only a few minutes to the off, the refreshes come in more often than every 200ms. As there are no requests for price data, there's no added delay. Notably, the charts on the VPS show lower times than those at home. The request delay will still be relevant when placing orders in the market, but no data is available on what it is.
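
For illustration, the gap pattern described above can be measured by differencing the arrival times of stream updates. A minimal sketch in Python (my bots are VBA/VB, and the timestamps here are made up, not real market data):

```python
# Sketch: measuring gaps between streamed market updates.
# The timestamp lists are hypothetical examples, not captured data.

def update_gaps(timestamps_ms):
    """Return the gaps (in ms) between consecutive stream updates."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

# Far from the off: sparse, irregular updates with long quiet spells.
early = [0, 4200, 4900, 12800]
# Near the off: updates arriving more often than every 200ms.
late = [0, 150, 290, 410, 560]

print(max(update_gaps(early)))  # the longest quiet spell early on
print(max(update_gaps(late)))   # the busiest spacing near the off
```

Plotting those gaps over time would reproduce the refresh chart described above: big holes early, then a flurry as the off approaches.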

Another major event, for me, was the change to the Aus turnover eligibility. I posted about this here. I'm now on with the coding around it. I have a section of code that runs only once when a market is selected, rather than on every refresh, and I'm adding the NSW code there. Checking against the list of courses was straightforward, but tracking the traded back bets over a week is a little more complicated.
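
The rolling-week tracking might look something like this, sketched in Python rather than the bot's actual VBA. The course names, window length and structure are all placeholder assumptions for illustration:

```python
# Sketch: a once-per-market course check plus a rolling 7-day tally of
# traded back-bet turnover. Course list and field names are hypothetical.
from collections import deque
from datetime import datetime, timedelta

NSW_COURSES = {"Randwick", "Rosehill", "Canterbury"}  # placeholder list

def is_nsw(course):
    # The once-per-market check, run when the market is selected.
    return course in NSW_COURSES

class TurnoverTracker:
    def __init__(self, window_days=7):
        self.window = timedelta(days=window_days)
        self.bets = deque()  # (settled_at, matched_amount), oldest first

    def record(self, settled_at, amount):
        self.bets.append((settled_at, amount))

    def weekly_total(self, now):
        # Drop bets that have aged out of the rolling window,
        # then total what's left.
        while self.bets and now - self.bets[0][0] > self.window:
            self.bets.popleft()
        return sum(amount for _, amount in self.bets)
```

A deque keeps the expiry step cheap because bets leave the window in the same order they entered it.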

The UK dogs have done well this month. I'm considering adjusting the stake range to allow higher bets, but I want to trial it on specific markets first. I'm thinking of those that are televised and tend to have much higher activity.

chart_ukdogs170630

An improved chart from the Aus horses compared to recent months. It's nice to see a good return, as I was beginning to lose patience and was considering stopping this bot. It's been a long time since I ran Oscar on the UK horses, as there was no value in them for me, and I was thinking the Aus horses were going the same way. They still might, to be fair. But for now, with this, it will continue.

chart_aus_horse170630

US horses – a better result than not trading them at all. There are some really well funded races that I'm missing, purely down to start times being well off. I'll continue moaning about this point until I finally get a solution in place (I've had some good suggestions from you but the code won't write itself, I should get on with it).

chart_ushorse170630

Definitely lower activity on the dish-lickers, with an unfortunate loss keeping the return in the negative. Not much harm in continuing for now; I consider this my experimental contribution (why not?).

chart_aus_dogs170630

May ’17 – and the art of separation.

Work on the VB bot was frustrating me, so I decided to pause it and have a play with Oscar, my VBA bot. I've nearly always run one instance of Oscar, navigating between different markets and sports based on some preset criteria. I decided to split the sports, running one instance for the UK dogs and one for the Aus horsies. This has the benefit of not missing conflicting events across the two sports. The reason Oscar didn't do this originally is that, back in the day, Betfair charged for making excessive calls above a relatively low limit. That changed some time back, but I hadn't changed with it.

This new set-up runs well, so I added Aus dogs, also in its own instance. And why not US horses? OK, they are now covered in another instance. (Previous attempts at US horses had not seen many trades, as most races were missed in favour of the other markets.)

I've monitored some cross-over times on the VPS and I haven't seen any drop in performance. At some points in the morning, three of the bots are running at a 0.2s refresh rate, but I'm still seeing a delay of less than 20ms on each.

I did notice the other day that the US horses were buggered by some error in the stated off time. For one venue the times appeared in the quick pick list, but when the markets were selected the off time was around an hour and a half out. This may have been an API issue, but if I see it again I'll look at coding to handle the mismatch between the two.
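
Handling that mismatch could be as simple as comparing the two stated times and skipping the market when they disagree too much. A sketch in Python, with the tolerance purely my own assumption:

```python
# Sketch: flag a mismatch between the off time shown in the quick pick
# list and the off time reported once the market is loaded.
# The 10-minute tolerance is an arbitrary illustrative choice.
from datetime import datetime

def off_time_mismatch(listed, loaded, tolerance_minutes=10):
    """True if the two stated off times disagree by more than the tolerance."""
    diff_minutes = abs((loaded - listed).total_seconds()) / 60
    return diff_minutes > tolerance_minutes
```

A bot could then sit the race out, or trust one source over the other, whenever the flag is raised.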

Next, Chris commented on Speedy data 2  –

Very interesting articles about bot speeds. I have been looking into the same. I have looked at my algorithms and have improved them. They now return values within 3-4 ms. However the main bottle neck is the price refreshes. Without streaming they currently have a price refresh at 200ms, my prices can be 180 ms out of date. If I was able to implement streaming I could improve my robots speed by a huge amount (probably 100ms), dwarfing any gains that could be made over optimising my robots. So I would suggest that the bottleneck is in your price refreshing and you could see a large improvement with your bots if you were able to stream prices.

Thanks for the comment. 3-4ms is fast and I haven't seen those speeds from my bots yet. How are you timing the code? And what language are you coding in? The arrival of streaming made me less eager to push on with the VB bot, as I don't want to put all the time in to get the code perfect just to see it become old hat overnight. Gruss, the software I use for my VBA bots, is releasing a beta streaming version soon. When I've had a go with that, I'll look at how to stream with VB. There's no point at all in not streaming if it's faster and as reliable (collective eye roll) as the API-NG.
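
On the "how are you timing the code?" question, one common approach is a high-resolution clock around many repeats of the call. A sketch in Python (the algorithm here is a meaningless stand-in, not anyone's actual trading logic):

```python
# Sketch: timing a code path with a high-resolution clock, averaged
# over many calls so per-call overhead doesn't swamp the measurement.
import time

def time_call(fn, *args, repeats=1000):
    """Return the average wall-clock time per call, in milliseconds."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    elapsed = time.perf_counter() - start
    return elapsed * 1000 / repeats

def dummy_algorithm(prices):
    # Stand-in workload: average of a small price list.
    return sum(prices) / len(prices)

ms = time_call(dummy_algorithm, [1.5, 2.0, 3.25])
print(f"average per call: {ms:.4f} ms")
```

Averaging over repeats matters: a single call at millisecond scale is mostly noise from the timer itself.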

And now, some charts.

UK dogs have done ok, nice steady performance.

chart_ukdogs170531

Aus horses continue to throw up some bad results. The three sharp drops in this chart have different causes. The first is actually three losing markets together, so no problem there. The second was an error, a problem I haven't seen for a while where an extra lay is submitted for some reason. I've previously put this down to timing, with the bot missing signals at specific points, e.g. when greening occurs and a bet is matched at the same time as a CANCEL-ALL command is triggered. The third was just a bad run of multiple bets being placed within the stoploss window, all eventually losing trades. When I've attempted to overcome this particular event in the past, the number of trades reduced significantly. I may look at it again, specifically in the Aus horse markets, but with a more complex solution.
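
One common defence against that kind of race (a bet matching at the same moment a cancel-all fires) is a simple state flag checked before any new submission. A minimal Python sketch; the state and method names are invented for illustration, not how Oscar actually works:

```python
# Sketch: refuse new lay submissions once the close-out/greening phase
# has begun, so a late signal can't slip an extra bet in.

class TradeGuard:
    def __init__(self):
        self.closing = False

    def start_greening(self):
        # Called when greening / CANCEL-ALL begins: from this point on,
        # no fresh entries are allowed, only exits.
        self.closing = True

    def may_submit_lay(self):
        return not self.closing

guard = TradeGuard()
assert guard.may_submit_lay()      # normal trading: lays allowed
guard.start_greening()
assert not guard.may_submit_lay()  # closing out: no fresh lays
```

In a single-threaded VBA loop a plain flag like this is enough; a multi-threaded bot would need a lock around the check-and-submit step.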

chart_aus_horse170531

NEW – Aus dogs. Although there aren't many markets, it has a good-looking chart, certainly one to watch. Stakes are still hovering around £2 for now.

chart_aus_dogs170531

NEW – US horses, only 3 days here, so wait and see what happens in June. Interesting to see average bets per market at 9.9, with the other sports being:

UK dogs = 5.6

Aus horses = 6.1

Aus dogs = 2.9

chart_ushorse170531

You can find all of Oscar’s UK dogs charts on a single page now – see here.

 

Australian greyhounds and other activities

As mentioned in a previous post, an Oscar clone has been let out of the traps (ha ha, out of the traps – greyhounds, traps, letting out of the… yeah, yeah) and has been trading the Australian greyhounds for the past week. The liquidity is generally lower on these markets, especially for the earlier races, but some of the later races are ok. With settings almost identical to those for the UK, only 27 races saw any trading, with a total of 115 bets settled. A profit of 27 pence (coincidentally) on £270.69 traded gives a P&L/TV of 0.1%, which is not an unusual figure for Oscar. I'll let it run for now to see what results come in.

aus_dogs_170521

Other activities

I know many of you will have it on your mind, but for those that are new here – back in October I did a post on stakes, ending with me wondering what to do with any profits. I gave four possibilities, as I saw them –

  1. Leave it where it is, doing nothing.
  2. Create a second Betfair account, for other/future bots or split activities.
  3. Remove from Betfair, put it in savings to be returned to Betfair when required.
  4. Remove from Betfair, spend it (I doubt this will happen).

Option 2 is not an option. I thought option 3 would be the one to go with, but the wife suggested (after pointing out that there weren't vast amounts to play with, "and after all that time you spend on the computer") that I should buy something – option 4 – which was MY LEAST favoured option. So after thinking it over and realising that she was right, because that's how it is, I decided to part company with my faithful 20+ year old mountain bike (Claud Butler frame swapped for a scrap '85 Ford Escort, crank and gears from a previous GT, all other parts swapped/added separately pre-2000, except tyres) and purchase a brand spanking shiny new modern lighter full-suspension ally-framed disc-braked mountain bike. I've been using it regularly for the past few months to try (really try) to improve my fitness. This is most certainly a work in progress.

img_20170114_140522068
Old
img_20170114_140422315
New (when it was new)

(p.s. I still have the old bike, just can’t bring myself to drop it at the tip after all these years)

 

Speedy data 2

My Speedy data post generated a few comments and some discussion. I really appreciate people taking the time to get involved and share their knowledge and views.

The first comments came via Twitter from TraderBot (here and here) with a link to stackoverflow. This is a site I've found really useful for help with many programming issues in multiple languages (that reminds me, I keep meaning to do a list of what I use – apps and sites). The Q&As linked to, although relating to Java, are an interesting read, with the answer to the speed question being, in summary, "it depends on what exactly you want to do/measure".

LiamPauling commented on the post asking where I'm hosted and whether I stream. I'm cloud hosted and not streaming. He went on to say that he thinks the bottlenecks are more likely elsewhere, which, after further reference in later comments, seems to be a good point.

Betfair Pro Trader asked why I wanted to use an array. It's not that I want to use an array more than any other data structure; I was looking for the best solution, if such a thing exists (which is becoming less clear).

Tony, via Twitter, suggested running test code with the different structures. This could be useful, but I was initially put off by the confusion I was getting from reading differing opinions based on various implementations of arrays, collections and dictionaries (and later, lists). At this point I was thinking that the optimum structure is dependent on the specific use, and there isn't an exact answer to my speed question.

Next, a comment from Ken. He points to lists, as they're something he uses regularly, and he talks of some of their benefits. Again, I'd previously come across articles saying lists were slow, but maybe I was too quick to dismiss them. Betfair Pro Trader has also suggested using lists and dictionaries combined. Ken adds that he codes in C# (C sharp), but I think for the purpose of data structures and speed the two are similar (C# and VB.net compile to the same intermediate language and run against the same runtime libraries).

n00bmind added a detailed comment. He makes the point that the advantages of one structure over another don't always hold, as mentioned above. He also goes on to agree with previous comments that my speed question may be missing the main issues – those being the program/algorithm itself and network latency. Further advice is given about profiling (something I haven't come across before as a specific process) and maybe using a different language, such as Python (I have only a basic understanding of Python from messing with it on my Raspberry Pi).

Finally, Jptrader commented, agreeing mostly with n00bmind, and others, about looking at “handling network latency properly and doing performance profiling”.

Although a simple answer hasn't been found (because there isn't one), I'm guided by these comments to focus more on my code: handling serialization and latency, making the algorithm efficient and using the data structures that work for now, whether that's arrays, collections, dictionaries, lists or a combination of them. Moving to another language just isn't feasible for me at the moment; it's taken me over a year to get a running bot in VB, with limited hobby time. I'm happy to accept that another language may have its advantages, so I'd advise others to look at this for optimising their bots' performance (for me the advantage will be seen moving from VBA to VB.net).
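
For anyone new to profiling as a specific process, here's roughly what it looks like, sketched in Python since that's where the tooling is simplest (the workload is a made-up stand-in, not my bot):

```python
# Sketch: basic performance profiling with the standard library.
# The profiler records how often each function runs and where the
# time goes, which is exactly the advice given in the comments.
import cProfile
import io
import pstats

def update_prices(book, updates):
    # Stand-in workload: apply a batch of (runner, price) updates.
    for runner, price in updates:
        book[runner] = price
    return book

profiler = cProfile.Profile()
profiler.enable()
book = {}
for i in range(1000):
    update_prices(book, [(f"runner{j}", 2.0 + j) for j in range(8)])
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()
summary = next(line for line in report.splitlines() if "function calls" in line)
print(summary.strip())
```

The value of a report like this is that it points at the functions actually eating the time, rather than the ones you suspect.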

The testing I've done hasn't shown any particular advantage for the different structures. From my searches on the web I think this could be due to the relatively small amount of data I'm handling (many articles talk of data in the tens to hundreds of thousands of lines when comparing structures). An error on my part, which had my bot making double calls for data, also added to my initial difficulties and questions.

I have plenty to be getting on with for now and will continue looking to improve my bots. Thanks again for all the comments.

Speedy data

Speed is an important part of my bot development. When it comes to storing data, the options (data structures) I've been working with are arrays, collections and dictionaries. When I Google for articles on speed, I get lots of information pointing to dictionaries as the fastest. But the main point of interest in those articles is the speed of looking up data. To find something in an array, elements have to be looped through until it's found (or isn't). The use of keys in collections and dictionaries makes lookup faster, as an item can be targeted without looping, as long as you know the key. There are other advantages that make dictionaries preferable to collections, but they seem less important when looking purely at speed. A disadvantage of arrays comes when changing their size. Collections and dictionaries can have elements added without problems; to do the same with an array, it has to be resized to accommodate the new data, which takes extra time as a new, larger array is created and the existing data copied into it.

This leaves me thinking that the two main speed disadvantages of arrays are searching for data and resizing. Here's the question I'm looking to answer: if I know the size I want the array to be and I know the location of all the data held in it (so I can refer to each element directly), is there a speed benefit to using collections or dictionaries? Any help appreciated.
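
The question can be put to a quick test. A sketch in Python (list standing in for an array, dict for a dictionary; the shapes differ from VBA but the principle carries over):

```python
# Sketch: when the position of every element is known up front, direct
# indexing needs no search, so a dictionary's keyed lookup buys little.
import timeit

SIZE = 1000
as_list = list(range(SIZE))
as_dict = {i: i for i in range(SIZE)}

list_time = timeit.timeit(lambda: as_list[500], number=100_000)
dict_time = timeit.timeit(lambda: as_dict[500], number=100_000)

print(f"list[i]   : {list_time:.4f}s")
print(f"dict[key] : {dict_time:.4f}s")
# Both lookups are constant-time here. The dictionary's search advantage
# only appears when the position is unknown and the array would have to
# be scanned element by element.
```

Which suggests an answer of sorts: with a fixed size and known positions, the keyed structures mainly buy convenience, not speed.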

Weeks ending 12-03-17

Well, nobody spotted last week's howler – I only titled it "Weeks ending 06-03-17". I guess you did see it but found more amusement in keeping quiet. You are fun.

algotradingforfun added this comment-

Great 2nd week there. Need to think about handling the bf crash scenario when in autopilot. I don’t think it would be a disaster if not about but does create some extra risk.

Thanks. For me the crashes can be a bit annoying. Oscar backs first, so the greatest loss is the stake, assuming a clean-cut crash. If you're laying first, the exposed risk between entry and exit is far greater; add multi-runner trading and it increases further, something to consider when setting up a bot.

 

Mike also commented-

The regular Betfair crash is a royal pain. Your take of their response is amusing and spot on. There is an API status page (not widely publicized) which is a little more real time than the “help” desk. Don’t know if you can link your bot to the status but might be an option. http://status.developer.betfair.com/

Thanks, again. A pain, agreed. I saw this status link on Twitter for the first time after this last crash and it does provide some confirmation, but it did seem a bit delayed. When I'd first seen the tweets I looked at the status and only one request was showing problems (/listmarketcatalogue maybe?), so trial and error would show whether it could be of any use to a bot. But it was certainly ahead of the Saturday boy and his well-thumbed guide.
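
If a bot were to use a status feed, the logic would be something like the sketch below, in Python. To be clear, the payload shape here is entirely invented; the real status page's format would need checking before wiring anything up:

```python
# Sketch: a bot-side health check against a status feed, suspending new
# trades while any request type is degraded. The payload shape is
# hypothetical, not the actual format of the Betfair status page.

def exchange_healthy(status):
    """Treat the exchange as usable only if every listed request is OK."""
    return all(state == "OK" for state in status.values())

# Hypothetical snapshot of per-request states during a partial outage.
snapshot = {
    "listMarketCatalogue": "DEGRADED",
    "listMarketBook": "OK",
    "placeOrders": "OK",
}
print(exchange_healthy(snapshot))  # bot should hold off while degraded
```

Given the delay mentioned above, a feed like this would be a backstop rather than the first line of defence.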

 

One week on these charts. Interesting profile on the dogs, start flat, end flat, with sharp rise Friday/Saturday. All figures are in line with previous period which is good.

170312

Aus170312

Another milestone was passed with these results: I became eligible to pay the premium charge, as my lifetime charges percentage dropped just below 20%, to 19.92%. I'd already used some of my allowance, which I think was linked to data charges that are no longer applied. So this week saw £1.98 taken off my allowance; at that rate it'll be 9 years before I actually pay anything. Unfortunately, if my total charges percentage continues to fall, the weekly PC will rise; a drop to 19.72% would have seen a PC of £5.50. This is the price of (small) success. On a positive note, this does put me in a bracket with 0.5% of customers which, if Wikipedia can be believed, is either 20,000 or 5,500 people. What joy.
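
As a rough illustration of the mechanics (and only that; this is a simplified sketch, not Betfair's exact premium charge formula), the top-up-to-20% idea with an allowance offset looks like this in Python:

```python
# Rough illustration only, not Betfair's actual calculation: the premium
# charge tops total charges up to 20% of gross profit, with any amount
# due taken from the allowance first. All figures below are invented.

def weekly_premium_charge(gross_profit, charges_paid, allowance):
    """Return (cash_due, remaining_allowance) for the week."""
    shortfall = max(0.0, 0.20 * gross_profit - charges_paid)
    from_allowance = min(shortfall, allowance)
    return shortfall - from_allowance, allowance - from_allowance

due, left = weekly_premium_charge(1000.0, 199.2, 500.0)
print(round(due, 2), round(left, 2))
```

With charges sitting at 19.92% of profit, the shortfall is tiny and comes straight off the allowance, which matches the experience described above: nothing actually paid, just the allowance nibbled away.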

Weeks ending 06-03-17

Two weeks this period, starting from 20:30 on the 19th (see the last update). A good return from both the dogs and horses, though the first week ended barely up. The second week was one of the most profitable I've had.

170305

Aus170305

Saturday 4th March saw another big Betfair crash*, with the exchange offline for nearly an hour and betting disabled for some time after that. As I was at the computer when it happened, I took the opportunity to run updates on the VPS OS. I have it set up to tell me when updates are available, but I choose when to install them. I'd advise any botter to do the same, as the last thing you want is the computer restarting mid-trade after auto-updating. I also decided to add a bit of code to change how and when the bot saves its log sheet. A simple enough task, as I'd already written the code for another, now retired, bot. Copy, paste, change the sheet references and save location, job done, what could possibly go wrong? Not testing with live updating, thinking I know best, and the bot locking up at 2am repeatedly trying to save a file that, by the second attempt, already exists: that's what can go wrong. I've said it myself before now – always check, check again and test live. And check again. No harm done, but I missed most of the Aus horses on Sunday. Allowing for the missed Saturday dogs as well, the results are even better than they look.
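
The fix for that particular trap is to never retry the same filename. A sketch in Python of one way to do it (the naming scheme is my own invention, and the `exists` parameter is only there so the logic can be exercised without touching the disk):

```python
# Sketch: build a log filename that cannot collide with an earlier save,
# avoiding the lock-up of retrying a name that already exists.
import os
from datetime import datetime

def safe_log_path(directory, when=None, exists=os.path.exists):
    """Return a timestamped log path, suffixed if that name is taken."""
    when = when or datetime.now()
    base = when.strftime("log_%Y%m%d_%H%M%S")
    path = os.path.join(directory, base + ".csv")
    n = 1
    # If a save already happened this second, add a numeric suffix
    # instead of attempting the same name forever.
    while exists(path):
        path = os.path.join(directory, f"{base}_{n}.csv")
        n += 1
    return path
```

The key property is that every call makes forward progress: a clash produces a new candidate name, never a retry of the old one.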

* All major exchange crashes seem to follow a similar pattern. Some people start reporting blips, pauses in the refresh rate. Betfair Customer Service (a questionable department title if ever there was one) denies all knowledge with the stock phrase "It's all good here". Then comes total blackout. Many an unrepeatable turn of phrase is screamed by the loyal customer base, and the Betfair bods half acknowledge with the second phrase of the Betfair How to Keep Customers in Suspense Guide – "We're looking into it". The third line, after some threats of violence, use of very specific graphic language and calls for a mass exodus to Betdaq, is – "Apologies for this guys, our techs are on it". Following a period of silence from the exchange masters, allowing for a build-up of calls for refunds and shared stories of thousands lost, comes the market-controlling monopoly's confirming legal statement – "Refer to our Ts&Cs". The first few markets after reboot are played cautiously before all but the over-exposed carry on as usual, allowing said monopoly off the recently polished hook.

Weeks ending 19-02-17 (8:30pm)

As previously mentioned, I've been using my time to code my own trading bot, so this catch-up covers seven weeks by my reckoning. On the dogs the return has been ok at 0.055%. I'm still running stakes that are not linked to balance but vary within a small range. All the dogs' weeks ended positive, just.

170219

The Aus horses performed better for the period, at 0.084%, but as can be seen from the chart, there was a negative run stretching over approximately 10 days. Combined with a low-profit week from the dogs, this saw the first overall weekly loss for quite some time, ending -£2.62. It does have an emotional impact, even though it's a small loss, after such a long time seeing the bank increase at the end of each Sunday. I did feel like I wanted to change something in the code, but I held tight and the P&L returned to a more usual level. This is why I couldn't trade manually – I'd drift away from the plan after each loss in an endless battle against the now.

aus170219

 

 

2016 – a brief history (and Week ending 01-01-17)

I’ll start with last week. Dogs were down on number of markets/bets/volume but up again on return. It’s the highest return since July.

170101

The Aus horses had a good week too. Although the return was down slightly, by 0.007, the overall trend is good. Last period saw two large losses on the dogs; this time it's the horses' turn. No worries though. Next, on to the annual review.

aus170101

Annual review

It's been a positive year, with a profit in most weeks; only three weeks ended in an overall loss (across all markets), one of those due to an error. This year also saw the move to a VPS and then a cloud VPS, giving a faster and more consistent connection. I've dabbled with my other bots but not taken them live. I tried Oscar over the pond for a while, but the US horses were a loss. Slayr was an accident that started out backing, then switched to laying, then back to backing (small number bias and all that). It is still running on Betdaq, an update on that soon, but it doesn't trade as such, just places a bet based on the prices available. My work on my own trading app stalled, but this is going to be my main focus this year. I've got Betfair Pro Trader's new book, which is providing inspiration and ideas. I've also made myself a plan which I hope will prevent me getting easily distracted and bouncing between one idea and another.

High and low

This year's high has to be the brilliantly consistent(ish) profits returned by Oscar. Regardless of actual amounts, the same trading algorithm has performed all year without many code changes, most of which were not to do with triggers or execution.

The low was the crash of Thursday 26th May at approximately 20:42, which resulted in a loss of £59.40. It set my bank back by around 5 weeks. And it hurt. But onwards and upwards, as they say.

The year in numbers

The charts I show on the blog include some amounts for the period. The figures for the period 4th January 2016 to 1st January 2017 are as follows –

Number of markets traded = 26775

Number of bets settled = 197158

Traded volume = £1039959.93

P&L = £717.25

PL/TV = 0.069%

Traded volume is probably an incorrect term, as it's borrowed from finance, but never mind. It's my way of measuring and is simply the total of the amounts going in and out of my account, i.e. money I've traded with other bettors.
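
The PL/TV figure above is just the year's P&L as a percentage of that traded volume. A one-liner in Python confirms the arithmetic using the numbers quoted:

```python
# Sketch of the blog's own measure: P&L divided by traded volume,
# expressed as a percentage, using the annual figures above.

def pl_over_tv(pl, traded_volume):
    """Return P&L as a percentage of total traded volume."""
    return 100.0 * pl / traded_volume

print(f"{pl_over_tv(717.25, 1039959.93):.3f}%")
```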

The blog

I've enjoyed doing the blog. It's useful for me to be able to look back at what I've been doing. I also hope it provides some sort of information/ideas/entertainment/wonder to others. My page views have gone up over the year, mainly with links from Twitter, Betfair Pro Trader and Green All Over. I seem to get views from all across the world – 60% are from the UK but I've had hits (according to WP stats) from the following countries: Australia, Sweden, Ireland, Portugal, Denmark, Czech Republic, Switzerland, Netherlands, Spain, France, United States, Norway, Italy, Greece, Germany, Slovakia, Brazil, Slovenia, Hungary, Poland, Romania, Russia, Hong Kong SAR China, New Zealand, Cyprus, Mexico, Bulgaria, Argentina, South Africa, Latvia, Finland, India, Turkey, Ukraine, Namibia, Malta, Peru, Austria, Malaysia, Serbia, Lithuania, Taiwan, Macedonia, Colombia, Canada, Croatia, Tunisia, Luxembourg, European Union (What? A country? Yes, according to WordPress), Philippines, Chile, Georgia, Belgium, Nigeria, Isle of Man, Singapore, Mauritius, Yemen, Pakistan, Uzbekistan, Thailand, South Korea, Tanzania, Israel, Iceland, Azerbaijan and, last but by no means least, Zimbabwe.

The future

With a certainty that is unknown in the betting/trading/gambling world, the future is definitely coming. As for what or when I do anything, I don’t want to make any promises. I do intend to become a more proficient VB programmer and a better bot trader, in that order.

Happy New Year!

Week ending 25-12-16

One week, and a good end to it. Monday/Tuesday were flat, but the rest of the week ran at an increased pace, returning a very good 0.107%. The two large losses, 2.24 and 2.26, were both in average markets; no low matched volumes or out-of-the-ordinary movements that I can see. With a day and a half less of markets this period, there were only 4 fewer markets traded and 9 fewer settled bets than the previous week. This may be down to more participants as people finished for the holidays.

161224

A similar feature on the Aus chart as on the dogs – Mon/Tues flat – but here the increased profit curve runs through Weds/Thurs/Fri, then flattens off for the Christmas Eve markets. It's still the best return for a period on the Aus horses since the end of September. I look forward to putting the year's trading results in the first Annual Review.

aus161224