Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programmes can engage with internet users in casual conversation. The programme had been designed to mimic the speech of a teenage girl, but it quickly learned to imitate the offensive language that Twitter users started feeding it. Microsoft was forced to take Tay offline just a day after it launched. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
