Deloitte News, Stories and Latest Updates
https://analyticsindiamag.com/news/deloitte/

Deloitte Inaugurates 4th Office in Bengaluru
https://analyticsindiamag.com/ai-news-updates/deloitte-inaugurates-4th-office-in-bengaluru/ (Wed, 17 Apr 2024)

Deloitte has expanded its presence in Bengaluru, India, by opening a new office, its fourth in the city. Located in Yemalur Village, Marathahalli, this office is equipped to house 6,000 professionals and is part of Deloitte’s strategic efforts to grow and expand in the region. This new facility will focus on serving global clients and adds to the existing trio of offices in the city, drawing on Bengaluru’s talent pool and infrastructure.


Members at the new office will work across various domains including artificial intelligence, data analytics, cybersecurity, cloud services, and more. The office, inaugurated by Lara Abrash, Chair of Deloitte US, features technological setups like an XR Studio and Innovation Labs. 

In March, the company inaugurated three new workplace hubs in Bengaluru, Noida, and Pune. It has offices in 14 cities in India, including Bhubaneswar, Coimbatore, Kochi, and Jamshedpur. 

In January, Deloitte stated that it is training over 120,000 employees through its AI Academy. Additionally, the company is investing over $2 billion in global technology learning initiatives aimed at improving skills in AI and related fields.

It also announced a substantial $2 billion investment in the IndustryAdvantage program to improve industry-specific services by integrating generative AI into Deloitte’s solutions. Additionally, Deloitte is expanding its suite of generative AI-enabled accelerators and enhancing its cloud-native platform, Converge.


The post Deloitte Inaugurates 4th Office in Bengaluru appeared first on AIM.

Deloitte Opens Up New Offices in Bengaluru, Noida, Pune
https://analyticsindiamag.com/ai-news-updates/deloitte-opens-up-new-offices-in-bengaluru-noida-pune/ (Fri, 22 Mar 2024)

Audit and consultancy giant Deloitte has inaugurated three new workplace hubs in India, strategically positioned in the bustling cities of Bengaluru, Noida, and Pune. 

These new establishments mark a significant milestone for the company, aligning with its commitment to proximity with its workforce and clientele. “These vibrant offices reflect the energy of Deloitte professionals and our confidence in the immense growth potential of our economy,” said Nitin Kini, chief operations officer, in a LinkedIn post.

With an array of amenities and collaborative spaces, these new workspaces are poised to elevate the employee experience. 

The company has offices in 14 cities in India, including Bhubaneswar, Coimbatore, Kochi, and Jamshedpur.

Approximately 25% of Deloitte’s global staff, about 120,000 employees, is currently based in India. The plan is to increase the workforce to between 150,000 and 160,000 employees. This initiative is a strategic part of Deloitte’s global expansion plan, highlighting India’s crucial role in the company’s growth.

The post Deloitte Opens Up New Offices in Bengaluru, Noida, Pune appeared first on AIM.

Cypher 2023: Key Highlights (Day 1)
https://analyticsindiamag.com/ai-origins-evolution/cypher-2023-key-highlights-day-1/ (Thu, 12 Oct 2023)

Cypher day 1 was a success, with more to come!

Cypher 2023 was jam-packed. Day one of the event witnessed more footfall than anticipated: over 1,500 participants and 600+ companies took part in India’s biggest AI conference.

The event kicked off with Karthik Ranganath, general manager of Business IT at Shell R&D, talking about unleashing AI innovations for the better.

In his keynote discussion, Ranganath spoke about how Shell is harnessing the power of AI to make the energy sector more efficient, alongside sharing its partnerships with multiple AI startups to help tackle some of the pressing challenges in the energy sector. 

This was followed by a talk on “Digital Minds” by Jacy Reese Anthis, cofounder at Sentience Institute, who emphasised the relationship between humans and machines in the backdrop of generative AI advancements.

He explained that there is a profound shift in not only how we interact with computers but also how we as a society interact with each other. “We talk to the computer like we talk to a friend with natural language instead of writing commands in code,” he said. 

With the rise of chatbots, there is a danger of forming attachments to computers that take on human-like characteristics through their language. His philosophical questions were thought-provoking and left the audience reflecting on his ideas. 

George Kuruvilla, the chief data platforms evangelist at SingleStore, shared how the company has built one of the best real-time data management systems, eliminating the need to run to multiple vendors to store, manage and harness data. 

He explained the ease of working with a single real-time platform. He also spoke of the issues that startups face in data management, while explaining how their larger customers work with extensive data. 

Biren Ghose, with his jovial personality, really connected with the audience.

Ghose spoke about how his company Technicolor Creative Studios has employed all the AI tools that create innovative animations and visualisations. He also showcased this with stunning videos that he played while talking about the how and the why of the entire creative process. 

Moving on to a more instructive session, Abhishek Nandy, chief data scientist at PrediQt Business Solutions Pvt. Ltd and an Intel Corporation certified instructor, gave a workshop on the AI kit tailored for Intel® architecture, showing data scientists and AI developers how they can seamlessly deploy AI models, particularly LLMs. 

He also touched upon the Intel® Developer Cloud, access to Ponte Vecchio instances, and practical demos using LLMs with LangChain, OpenAI, and Stable Diffusion.

Moumita Sarker, from Deloitte, spoke about how organisations can effectively balance the need for speed to market with the requirement for robust AI model development and testing. While the latter takes time, there is urgency to get a product out. She shared, “AI has to be served in modules; organisations have to make it easily pluggable. Spend more time on the pilot, spend more time on user co-creation. Make it valuable, give them the option to customise.”

One of the panel discussions explored how AI has barged into every field, and into everyone’s life and workplace. The conversation was between industry experts Akanksha Singh, Jayachandran Ramachandran, Vinodh Ramachandran and Chirag Jain.

The discussion was largely around how the adoption of AI has affected employees and how to navigate this introduction positively. 

In addition to this, it also touched on the legal, academic, and enterprise perspectives and the panellists also elaborated on the best work practices. 

“The management should provide resources and clearly explain the vision of the organisation. It is the responsibility of anyone in the leadership position to instil faith and rally the organisation towards a common goal,” Vinodh Ramachandran clearly summed up.

Lastly, Jonty Rhodes, the South African cricketer and legendary fielder, spoke on the role of data gathering and analytics in giving players an edge in cricket. The retired player, now a coach to IPL teams, touched upon different aspects of analysing players. This inevitably improves strategy, he said, but not without the peril of too much information, which leads to decision paralysis. 

A humble Rhodes was in high spirits and spoke about his love for India. He also gave away the Minsky Awards to some of the exemplary leaders and companies in AI and analytics for their contribution and impact.

The post Cypher 2023: Key Highlights (Day 1) appeared first on AIM.

Crunching Time to Market with Agile AI Solutions
https://analyticsindiamag.com/ai-origins-evolution/crunching-time-to-market-with-agile-ai-solutions/ (Tue, 31 Jan 2023)

AI is now a dominant driver for many industry-wide use cases. However, despite its ubiquity, AI implementations face quite a few stubborn challenges, such as time-to-market and scaling. The traditional development cycle of an AI solution, from identifying the problem area, acquiring data, preparing the model, running it and testing it, to finally deploying it at scale, can take 4-5 months to show business impact; these challenges can be addressed by shortening that cycle. In a progressively competitive and fast-evolving data-driven world, it is becoming more and more necessary to be agile in adopting AI-driven solutions.

Organizations are trying to get around this problem by reusing highly customized in-house solutions across problem areas. But building customized solutions comes with challenges in decision-making and in the individual effort spent on each problem. In addition, it requires iterations, validations, and coordination across multiple technology and business teams. Implementing product-based solutions also has associated logistical challenges such as licensing, IP rights and other paperwork. 

Pre-built AI as a faster, more reliable, and cost-effective solution

Using pre-built, specialized, targeted solutions that can be quickly customized to different datasets is known to reduce turnaround time while being cheaper and more dependable. Furthermore, since the solution is pre-built, it also resolves the logistical challenges for the end user while making it easier to reuse the core solution across multiple problem statements. 

Pre-built solutions also offer speed and agility through quicker model building and scaling using MLOps, creating a loop of continuous testing and improvement of AI models. With pre-built AI solutions, modularity is also an added advantage. Since the solutions or parts of solutions are originally designed to fit in with each other, it is easier to add new capabilities and features over an existing one in a comparatively shorter time. 

Also, traditionally, to build a new solution, the technical team had to consider existing infrastructure, tech stack and other nitty-gritty. But recent advances in open-source technologies enable service providers to make largely technology-agnostic solutions, making pre-built solutions an easier plug-and-play alternative. 

That being said, it is not a cookie-cutter approach. Problem identification and diagnosis still need business acumen and perspective, which requires competent functional expertise. Dedicated effort also needs to be spent on exploratory data analysis (EDA). Moreover, the standardized nature of these solutions necessitates the templatization of input and output data. Even then, the pros far outweigh the cons, leading major data solution providers to drift towards pre-built AI solutions. 

Deploying pre-built AI solutions

Companies are adopting pre-built AI solutions across use cases such as marketing, promotion optimization, pricing, and budget decisions. Forecasting is another area of interest for data solution companies, whether on the supply side (the price of commodities) or the demand side (sales, volume). 

For instance, in the traditional approach to demand forecasting, a firm, either via an external vendor or an in-house forecasting department, had to understand the historical demand data, prepare it, choose the algorithms/models that work best (regression, time-series models such as ARIMA, etc.), and finally validate the outcome. This is even before tackling the scaling and deployment phases, which come with their own sets of problems. However, since the core approach to modelling a demand forecast is quite similar across firms, it is one of the best use cases for the modular approach of a pre-built AI solution. Once the historical demand input data is standardized to be fed into the model, only the validation and testing need human intervention. Implementing MLOps also ensures that the model is self-correcting as new data becomes available, as opposed to the traditional approach, which requires occasional retraining.
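As a concrete, dependency-light sketch of this plug-and-play shape, assume a standardized demand history comes in and a next-period forecast goes out. A real pre-built solution would bundle richer models (ARIMA, regressions on engineered features); the function name and toy data below are illustrative only.

```python
import numpy as np

def forecast_next(demand, periods_ahead=1):
    """Fit a linear trend to a standardized historical demand series
    and extrapolate it; a stand-in for the richer models (ARIMA,
    time-series regressions) a pre-built solution would bundle."""
    t = np.arange(len(demand))
    slope, intercept = np.polyfit(t, np.asarray(demand, dtype=float), 1)
    future_t = len(demand) + periods_ahead - 1
    return slope * future_t + intercept

# A synthetic demand history with a clear upward trend:
history = [100, 110, 120, 130, 140]
print(round(forecast_next(history), 1))  # → 150.0
```

Because the input format is standardized, swapping in a different model family changes only the internals of the function, not the interface the business user sees.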

As analytics and AI permeate more and more into the business world, firms will need faster results to see the impact. The advent of pre-built AI solutions makes it possible for businesses to work with agile models and reduce the time to show business impact by 60% compared to the traditional way of approaching AI/ML solutions.

The post Crunching Time to Market with Agile AI Solutions appeared first on AIM.

Meet the winners of Deloitte and MachineHack’s “Machine Learning Challenge”
https://analyticsindiamag.com/creative-ai/meet-the-winners-of-deloitte-and-machinehacks-machine-learning-challenge/ (Mon, 17 Jan 2022)

Bad loans are a menace that weakens our financial system. In a bid to solve the loan defaulter problem, Deloitte teamed up with MachineHack to organise a hackathon for data scientists and machine learning practitioners called “Machine Learning Challenge” from November 29 to December 13, 2021. The hackathon focused on various attributes such as funded amount, location, loan, balance, etc., to predict if a person will be a loan defaulter or not. 

The winners of the hackathon have been declared and will take home cash prizes worth up to INR 1 lakh. Let’s get to know these top 5 leaderboard winners and their methods to ace the hackathon.

Rank 01: Chandrashekhar

Chandrashekhar has been crowned the winner of the Deloitte hackathon. After he graduated in information technology, one of his professors encouraged him to pursue a career in data science. He was always a maths person and saw a close relationship between data science and maths, so he pursued a six-month course in data science, which changed his perception of data.

Approach

Chandrashekhar says he focused more on feature engineering than on model development to win this competition. He extracted various features from the existing variables, generated binning, PCA, and arithmetic features from the numerical columns, and combined features using the categorical columns.

Chandrashekhar adds, “I have generated aggregation features among the categorical and numerical columns. Then, I used Optuna to tune the model (lgbm) and extracted useful features using lgbm feature_importance. Finally, I built stratified_cv fold to predict their results and used the same method for Catboost and Gradientboost. Then, Ensembling the three models gave the best solution.”
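As a rough illustration of the workflow he describes (out-of-fold predictions from several models, then an ensemble), here is a sketch on synthetic data. sklearn estimators stand in for the LightGBM/CatBoost/GradientBoost trio, and the Optuna tuning and feature-importance steps are omitted.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
models = [GradientBoostingClassifier(random_state=0),
          RandomForestClassifier(random_state=0),
          LogisticRegression(max_iter=1000)]

# Out-of-fold predictions per model via stratified CV, then a simple average.
oof = np.zeros((len(models), len(y)))
for tr, va in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    for i, m in enumerate(models):
        m.fit(X[tr], y[tr])
        oof[i, va] = m.predict_proba(X[va])[:, 1]

ensemble = oof.mean(axis=0)  # blend the three models' probabilities
print(ensemble.shape)        # one out-of-fold probability per sample
```

In practice the blend weights (and each model's hyperparameters) would be chosen by validation score rather than a plain mean.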

Chandrashekhar calls participating in the hackathon a good experience. He has been taking part in MachineHack hackathons to improve his fundamentals. He says, “I strongly believe MachineHack is providing a great platform for data science aspirants. This platform conducts hackathons on a variety of problems, and I met some cool people through MachineHack.”

Check out the solution here

Rank 02: Felipe Carneiro Wanderley

Felipe graduated in Production Engineering and presently works as a bid analyst at a government company named IBGE. It has been around nine months since he started studying machine learning: he searched for an area that joined programming and statistics, and found data science a perfect fit.

Approach

Felipe says he invested most of his time preprocessing the data, especially in feature engineering. Three variables were engineered (Batch Enrolled, Grade and Loan Title), four encoded using the label encoder (Sub Grade, Employment Duration, Verification Status and Initial List Status), and four excluded (ID, Application Type, Accounts Delinquent and Payment Plan).

He adds, “For the features engineered, I used an ordinal encoding by calculating the percentage of defaulters for each class and ordering these. I tried to use the target encoding, but the model became overfitted, and the Log Loss metric for validation got worse. One Hot encoding also did not bring good results for the model.”

In terms of preprocessing, since he used a tree-based model, normalisation/standardisation of the data was not necessary. However, the target is imbalanced, so SMOTE oversampling was needed; the sampling_strategy chosen was 0.11.

He says that before using the SMOTE, the mean of the probability that his model brought was something between 9.5~10.5. However, from the feedback given from the public leaderboard, using the Log Loss metric and using the mean of his submission, he could calculate the mean of the public leaderboard. The mean calculated was 11.3~12.0.

He adds, “To test my theory, I submitted to achieve the Dumb-Log Loss, using only the value I assumed to be the proportion of Loan defaulters, that is 0.1167. Before submitting, I calculated the Log Loss of a submission that only contained the value of 0.1167 and got 0.3603. This solution (Sub 9) brought me a Log Loss of 0.36024, confirming my theory.”
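His sanity check is easy to reproduce: the log loss of predicting a constant probability p on a dataset whose positive rate is also p has a closed form, and at p = 0.1167 it comes out at roughly the 0.3603 he reports.

```python
import math

# Log loss of a constant prediction p for a dataset whose positive
# rate is also p (Felipe's "Dumb-Log Loss" check, p = 0.1167):
p = 0.1167
dumb_log_loss = -(p * math.log(p) + (1 - p) * math.log(1 - p))
print(round(dumb_log_loss, 4))  # → 0.3603
```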

As the mean of his solution was a little displaced, he decided to use the SMOTE sampling_strategy of 0.11, which shifted the mean of his submission into the proper range. In addition, he used a tree-based model, for which the impact of multicollinearity tends to zero.

He used different algorithms such as Random Forest Classifier, KneighborsClassifier, Logistic Regression, XGB Classifier, Voting Classifier with these combinations, etc. The one that produced the best results was Random Forest Classifier. After submitting the RandomForest baseline, he used the GridSearch to choose the best parameters. 

Wanderley adds, “For the model evaluation, I used cross-validation with five folds. Additionally, I used a pipeline so that I could aggregate both transformers, SMOTE and Random Forest Classifier.”
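Felipe used imblearn's SMOTE inside the pipeline; as a dependency-light sketch of what the resampling step does, plain random oversampling to the same 0.11 minority-to-majority ratio is shown below (SMOTE instead synthesises new minority points by interpolating between neighbours). The function and data are illustrative.

```python
import numpy as np

def oversample_to_ratio(X, y, ratio=0.11, seed=0):
    """Randomly duplicate minority samples until the
    minority/majority ratio reaches `ratio`."""
    rng = np.random.default_rng(seed)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    target = int(ratio * len(majority))            # desired minority count
    extra = rng.choice(minority, size=max(target - len(minority), 0))
    idx = np.concatenate([majority, minority, extra])
    return X[idx], y[idx]

y = np.array([0] * 1000 + [1] * 50)
X = np.arange(len(y)).reshape(-1, 1)
Xr, yr = oversample_to_ratio(X, y)
print(round((yr == 1).sum() / (yr == 0).sum(), 2))  # → 0.11
```

Wrapping the resampler and the classifier in one pipeline, as he did, guarantees the oversampling is re-fit inside each cross-validation fold rather than leaking across folds.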

His first contact with MachineHack was through a data science course in Brazil. The aim was to participate in such a hackathon and achieve the top 100 positions on the leaderboard. He chose the competition “Prediction House Prices in Bengaluru” and got fourth place.

Check out the solution here

Rank 03: Tapas Das

Tapas is presently working as a delivery manager in The Math Company. He is a machine/deep learning enthusiast and first got interested in this subject area in 2018. He went through different MOOCs like the Andrew Ng ML course and the Deep Learning Specialization course on Coursera. He also spent a significant amount of time learning Python programming basics and then started picking diverse types of projects from various online sources like Kaggle, HackerEarth, Driven Data and slowly got comfortable with the analytical mindset and data science approach.

Approach

He started with basic EDA to explore the dataset and modified the data distribution for a few continuous (Funded Amount Investor, Interest Rate, Home Ownership, etc.) and categorical (Loan Title) variables. Next, he used Label Encoder to encode the categorical variables and used the Feature Tools library to generate 200 new features.

He adds, “Finally, I used a weighted average ensemble of LightGBM, CatBoost and XGBoost models to generate the final predictions. Also, I used the Optuna library for hyperparameters search for the different models.”
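The final blending step he mentions reduces to a weighted average of per-model predicted probabilities. The vectors and weights below are illustrative stand-ins for the LightGBM/CatBoost/XGBoost outputs; in practice the weights would be chosen by validation score.

```python
import numpy as np

# Illustrative per-model probability predictions for three samples:
preds = {
    "lgbm":     np.array([0.10, 0.80, 0.30]),
    "catboost": np.array([0.20, 0.70, 0.40]),
    "xgboost":  np.array([0.15, 0.90, 0.20]),
}
weights = {"lgbm": 0.5, "catboost": 0.3, "xgboost": 0.2}  # must sum to 1

# Weighted average across models, per sample:
final = sum(w * preds[name] for name, w in weights.items())
print(final)  # 0.14, 0.79, 0.31
```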

Tapas has been participating in different hackathons on the MachineHack platform for a while now. He says, “I love how the different problems mimic the real-world scenarios, which helps in a deeper understanding of that domain. Also, it’s fun to compete with the greatest minds in the area of data science. It really brings out the best in me.”

Check out the solution here

Rank 04: Vivek Kumar

Kumar says he gets bored quickly if he is not exploring new things. He likes data science because it allows him to explore, iterate, and learn from different problems and experiments, just as we human beings learn. He added, “When I started learning, I realised that data science skills could be applied to many different domain areas, so I started making slow and steady progress by taking different training courses, participating in different hackathons.”

Approach

Kumar’s approach combined the technical steps with a focus on understanding the business process and the data before moving to data preparation, modelling, and evaluation.

In terms of data analysis, there were 67,463 rows and 35 features (26 numerical and nine categorical) in the training data. No missing or duplicate entries were found in the training and testing data. Train and test data distribution was similar for most of the features, except Revolving Balance, Collection Recovery Fee, Accounts Delinquent, Total Collection Amount, and Total Current Balance. The collection- and recovery-based features showed a positive relationship with the target outcome.

Target Data Analysis

The target variable (Loan Status) was highly imbalanced.

Missing value analysis

No missing values were found in the train and test datasets.

Transformation

Log transformation was applied to all the numeric features. Tree-based models like XGBoost are not sensitive to monotonic transformations, but log-transforming all numeric features nevertheless helped improve the cross-validation and leaderboard scores. 
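A minimal illustration of why this transformation is benign for tree models yet useful in practice: np.log1p compresses the long right tail of skewed monetary features while preserving order. The amounts below are illustrative.

```python
import numpy as np

amounts = np.array([100.0, 1_000.0, 10_000.0, 1_000_000.0])
logged = np.log1p(amounts)           # log(1 + x), safe at zero

print(bool(np.all(np.diff(logged) > 0)))  # order preserved → True
print(logged.max() / logged.min())        # spread shrinks from 10,000x to ~3x
```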

Features engineering

New features were generated based on the interaction of some of the features. Kumar performed a Train/Test distribution check for each of the new features. Only the features showing a similar distribution across both train and test data have been included for model training to avoid any prediction drift. This allowed him to restrict the feature list.

  • Calculated the sum of ‘Recoveries’ and ‘Collection Recovery Fee.’
  • Calculated the sum of ‘Total Collection Amount’ and ‘Total Received Late Fee.’
  • Created interest rate category based on ‘Interest Rate.’

Categorical encoding

Kumar also performed Frequency encoding for the categorical features. It was done inside the cross-validation loop during model training and validation to avoid data leakage.
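Leakage-free frequency encoding can be sketched in a few lines: the counts are fitted on the training fold only and then mapped onto the validation fold, with unseen categories falling back to zero. The data here is illustrative.

```python
from collections import Counter

train_fold = ["A", "A", "B", "C", "A", "B"]
valid_fold = ["A", "B", "D"]

freq = Counter(train_fold)                      # fit on the train fold only
encode = lambda col: [freq.get(v, 0) for v in col]

print(encode(train_fold))  # → [3, 3, 2, 1, 3, 2]
print(encode(valid_fold))  # → [3, 2, 0]  ("D" unseen → 0)
```

Refitting `freq` inside every cross-validation fold, as Kumar describes, is what prevents validation-fold category counts from leaking into training.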

Model training and validation approach

Kumar chose gradient boosting as the final algorithm because it works best at identifying non-linear relationships between features.

  • Spot training/validation was performed for 12 machine learning algorithms (a combination of linear and non-linear algorithms). This helped him understand what types of models can uncover the pattern for making predictions. The Pycaret package provides an easy way to perform this activity.
  • Based on the spot testing done in the first step, the XGBoost model was used as the final model for training, evaluation and prediction. 
  • The model was trained and evaluated on different training/validation splits using a five-fold cross-validation technique. This ensured that the original training dataset was used for both training and validation, which helped improve model robustness through the diversity of training/validation splits.
  • The test predictions were generated in each of the five folds and then averaged to generate the final test predictions. 
  • Local validation score variation based on cross-validation showed a similar trend to the leaderboard score. This helped him build a robust validation strategy and allowed him to experiment with different aspects of model building, like validating new features, new encoding techniques and hyperparameter tuning. The cross-validation score is comparable with the leaderboard.
  • Out of the Fold Log loss metric: 0.30822
  • Public Leaderboard Log loss: 0.35350
  • Private Leaderboard Log loss: 0.34099

The Optuna package was used for the hyperparameter tuning.

Kumar said, “We can see that the features Loan Amount and Collection Recovery Fee have been given the highest importance scores among all the features, as they have been used many times by XGBoost for splits. It would be unfair to make any business decision based on XGBoost feature importance compared to other well-established techniques used for model interpretability, like SHAP, which is based on game theory backed by a solid mathematical foundation to justify the rationale behind global feature importance.”

SHAP

Kumar added that the mean absolute value of the SHAP values for each feature is taken to get a standard bar plot. 

The SHAP library provides easy ways to aggregate and plot the Shapley values for a set of points (in this case, the validation set) to obtain a global explanation for the model.

The new feature generated based on the summation of collection amount and received late fee has been given the highest importance score among all the features. The higher the value of Collection_amt_plus_received_late_fee, the more impact it will have on the loan default, which is intuitive.
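The aggregation Kumar describes reduces to a mean of absolute values per feature. Assuming a matrix of SHAP values has already been computed (for example by shap's TreeExplainer), the global-importance bars are just the column-wise mean of absolute values; the matrix below is illustrative.

```python
import numpy as np

# rows = validation points, columns = features (illustrative SHAP values)
shap_values = np.array([[ 0.20, -0.05,  0.10],
                        [-0.30,  0.10, -0.05],
                        [ 0.25, -0.15,  0.00]])

# Mean absolute SHAP value per feature gives the standard bar plot heights:
global_importance = np.abs(shap_values).mean(axis=0)
print(global_importance)  # 0.25, 0.10, 0.05 per feature
```

The feature with the largest bar (here the first column) is the one the plot would rank as most influential globally, matching the Collection_amt_plus_received_late_fee result he reports.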

In terms of his experience with MachineHack, Kumar said that the team is very supportive. He says, “It felt quite satisfying by solving some of the challenging problems faced by the industry, and MachineHack made it possible by bringing those problems to a platform where different hackers compete to solve them. Thank you, MachineHack.”

Check out the solution here

Rank 05: Rahul Pednekar

Pednekar has always been passionate about new technologies, especially data science, AI and machine learning. His expertise lies in creating data visualisations to tell his data’s story and in using feature engineering to add new features, bringing a human touch to the world of machine learning algorithms. In addition, he is very interested in developing software that solves real-world problems by leveraging data to make efficient decisions and predict the future. 

He heavily utilises Python to clean, analyse, and perform machine learning on data and has over 19 years of work experience in IT, project management, software development, application support, software system design, and requirement study.

Approach

He started with outlier removal, keeping only those rows in the training data where “Collection Recovery Fee” < 55, “Total Current Balance” < 1,000,000 and “Total Received Late Fee” < 40.
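With pandas, this filtering step is a single boolean mask. The column names follow the article; the toy data is illustrative.

```python
import pandas as pd

df = pd.DataFrame({
    "Collection Recovery Fee": [10, 60, 30],
    "Total Current Balance":   [50_000, 20_000, 2_000_000],
    "Total Received Late Fee": [5, 10, 45],
})

# Keep only rows within the stated thresholds:
mask = ((df["Collection Recovery Fee"] < 55)
        & (df["Total Current Balance"] < 1_000_000)
        & (df["Total Received Late Fee"] < 40))
df_clean = df[mask]
print(len(df_clean))  # → 1  (the second and third rows each break a threshold)
```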

For data type conversion, he converted the following numeric data type columns into Object data types as they were categorical columns: 

  • Term
  • Delinquency – two years
  • Inquiries – six months
  • Public Record
  • Open Account
  • Total Accounts
  • Last week Pay

Then, he converted the Target column “Loan Status” from the object data type into a numeric data type.

A few categorical values in the test dataset were not present in the training dataset, so he replaced them with values present in both the train and test datasets.

  • Column name = “Term”: Replace “60” with “59” 
  • Column name = “Delinquency – two years”: Replace “9” with “8” 
  • Column name = “Total Accounts”: Replace “73” with “72”

The column “Loan Title” contained many duplicate values, so it was cleaned by consolidating the various raw categories into 16 final categories:

  • Personal_Loan: All types of personal loans
  • Vacation_Loan: Any loan taken for a vacation
  • Home_Loan: Any loan taken for buying a new home or renovating an existing home 
  • Medical_Loan: Loan taken for medical purposes
  • Debt_Consolidation_Loan: Loan taken to consolidate existing debt
  • Consolidation_Loan: All types of consolidation loans
  • Credit_Card_Consolidation: All types of credit card consolidation loans
  • Debt_Free: Loan taken to become debt-free
  • CREDIT_CARDS: Loan taken over credit cards
  • REFI_LOAN: Loan taken to refinance existing loans
  • Other_Loans: All other types of loans 
  • CAR_LOAN: Loan taken for a car
  • Major purchase: Loan taken for a major purchase 
  • Business: Any type of business loan
  • Moving and relocation: Loan taken for moving and relocation
  • Other: Any other type of loan

A new column, “Loan Type”, with the above 16 categories was then created. Finally, he dropped four columns (ID, Payment Plan, Loan Title and Accounts Delinquent) as they were not adding any value.

Modelling and Prediction

For this, Pednekar used:

  • One Hot Encoding: used the get_dummies() function to form around 400 columns for final modelling.
  • RandomizedSearchCV to find the best hyperparameters for a RandomForest Regressor.
  • Predicted the test data using the best model given by the hyperparameter-tuned RandomForest Regressor. 
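The steps above can be sketched on toy data. The column names and tiny search space are illustrative; Pednekar's real matrix had around 400 dummy columns.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

df = pd.DataFrame({
    "Loan Type": ["Personal_Loan", "Home_Loan", "CAR_LOAN", "Personal_Loan"] * 10,
    "Term": ["36", "59", "36", "59"] * 10,
    "Loan Status": [0, 1, 0, 1] * 10,
})

# One-hot encode: one column per category value of each object column.
X = pd.get_dummies(df.drop(columns="Loan Status"))
y = df["Loan Status"]

# Random search over a tiny illustrative hyperparameter space.
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions={"n_estimators": [50, 100], "max_depth": [3, 5, None]},
    n_iter=3, cv=3, random_state=0,
)
search.fit(X, y)
preds = search.best_estimator_.predict(X)
print(X.shape[1], len(preds))  # 5 dummy columns, 40 predictions
```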

In terms of his experience at MachineHack, Pednekar says, “I would like to thank MachineHack for providing me with the opportunity to participate in the Deloitte Machine Learning Challenge. It has been a wonderful learning experience, and I would like to participate in future hackathons. I would encourage them to organise many more hackathons.”

Check out the solution here.

The post Meet the winners of Deloitte and MachineHack’s “Machine Learning Challenge” appeared first on AIM.

Deloitte’s AI Institute Launches In India To Accelerate Innovation
https://analyticsindiamag.com/ai-news-updates/deloittes-ai-institute-launches-in-india-to-accelerate-innovation/ (Tue, 07 Dec 2021)

Deloitte has launched an AI Institute to build an ecosystem in the country by integrating AI innovations and research for applications in various organisations. Called the ‘Deloitte AI Institute India,’ it aims to bridge the gap between organisations that embrace AI and those waiting for ‘the future’. Building AI solutions and skilling will be the priority of the institute.

“There are inherent strengths that India can leverage. At the moment, start-ups, research and academic institutions, and businesses themselves are heavily invested in discovering multiple new AI-based innovative solutions. “Deloitte AI Institute India” will integrate these efforts to help organisations transform quickly with AI”, said Romal Shetty, President, Consulting, at Deloitte Touche Tohmatsu India LLP.

The institute in India will collaborate with other institutes and share best practices, insights, research, case studies, and points of view, in addition to building AI solutions for India and the world.

“A dedicated AI team at Deloitte, along with domain and industry experts, will drive stronger outcomes. We will also bring our business lens that weighs ethical considerations and accountability to help recognise biases or risks. AI will be a part of everything we do as an organisation,” said Saurabh Kumar, Partner, Deloitte Touche Tohmatsu India LLP.

The first Deloitte AI Institute was launched in the US in June 2020. Since then, it has been successfully launched in other countries, such as Canada, the UK, Germany, China, and Australia.

8 Companies with Highest Salaries for Data Scientists In India https://analyticsindiamag.com/innovation-in-ai/8-companies-offering-the-highest-salaries-to-data-scientists-in-india/ Thu, 14 Oct 2021 07:30:00 +0000 https://analyticsindiamag.com/?p=10051346


Data is the new oil. Companies across industries realise the importance of using data to analyse performance and predict the future to ultimately ease up the decision-making process, as a result of which, data scientists have become an indispensable part of every organisation, irrespective of its industry, type and size. 

This has resulted in a spike in demand for data science jobs over the last couple of years, and data scientists are among the highest-paid employees at most companies. According to Analytics India Magazine research, around 1,400 data science professionals working in India earn more than Rs 1 crore a year. In fact, as per data available with job search and company review platform Glassdoor, the national average salary for data scientists in India is Rs 10 lakh a year. 

We have curated a list of companies offering the highest-paying data science jobs in India.

Top 8 Companies with Highest Salaries for Data Scientists

Name | Starting Salary | Average Salary
Amazon | Rs 5 lakh | Rs 15.56 lakh
Flipkart | Rs 14.5 lakh | Rs 24.2 lakh
Walmart | Rs 14.5 lakh | Rs 24.6 lakh
Hewlett Packard Enterprise | Rs 10 lakh | Rs 21,28,671
IBM | Rs 10 lakh | Rs 10.91 lakh
Deloitte | Rs 5.5 lakh | Rs 12.41 lakh
Accenture | Rs 1.8 lakh | Rs 10.20 lakh
[24]7.ai | Rs 11.2 lakh | Rs 19.31 lakh

1. Amazon

Global e-commerce giant Amazon hires data science professionals for tasks ranging from supply chain optimisation and inventory and sales forecasting to fraud detection. According to the company, data scientists at Amazon form the link between the business and technical sides. 

Salary: Rs 5 lakh to Rs 45.57 lakh | Rs 15.56 lakh (average) 

2. Flipkart 

Indian e-commerce giant Flipkart employs data scientists across teams. Their roles vary from building Systemic Intelligence across Flipkart products to uncovering and redefining shopping trends in the country. The data science team designs capabilities and solutions across the supply chain, fintech, consumer experience, search and discovery, demand shaping, and fraud modelling. 

Salary: Rs 14.5 lakh to Rs 42 lakh | Rs 24.2 lakh (average)

Check the data science roles available at Flipkart here

3. Walmart 

Flipkart’s parent company Walmart has about 29 wholesale stores spread across the country under the name of ‘Best Price’. Data science professionals at Walmart find new and innovative ways of applying data across businesses while leveraging automation to improve processes. Data scientists at Walmart are responsible for mixing technology and retail. 

Salary: Rs 14.5 lakh to Rs 33.5 lakh | Rs 24.6 lakh (average) 

Check the data science roles available at Walmart here

4. Hewlett Packard Enterprise

Hewlett Packard Enterprise, or HPE, aims to help customers manage and assess their data. It provides them with advanced tech solutions, services and consumption models to work on the available data. 

Salary: Rs 10 lakh to Rs 31 lakh | Rs 21,28,671 (average)  

5. IBM

American multinational tech company IBM was named a leader in the 2021 Gartner Magic Quadrant for data science and machine learning platforms. Its products include IBM Watson Studio, IBM Cloud Pak for Data, IBM Decision Optimisation and IBM SPSS Modeler, tools used by data scientists and developers. 

Salary: Rs 1 lakh to Rs 44.62 lakh | Rs 10.91 lakh (average)

Check the data science job openings at IBM here

6. Deloitte 

Deloitte provides end-to-end business solutions for data science practitioners. It claims that analytics is in its DNA, and thus, Deloitte’s expertise lies in providing smart insights for stronger outcomes. 

Salary: Rs 5.52 lakh to Rs 27 lakh | Rs 12.41 lakh (average)

Check the data science job openings at Deloitte here

7. Accenture 

Accenture has a diverse team of data scientists and AI experts to leverage the power of AI and bring in changes in companies and in society at large. Its team of data scientists apply ethical and responsible AI to measure the impact of businesses. 

Salary: Rs 1.87 lakh to Rs 31 lakh | Rs 10.20 lakh (average)

Check the data science job openings at Accenture here

8. [24]7.ai

Customer experience software company [24]7.ai uses AI and ML to understand and analyse customer intent. It helps companies create personalised, predictive and seamless customer experiences across channels. Data science professionals at [24]7.ai are responsible for building and defining AI models into products and offerings. 

Salary: Rs 11.27 lakh to Rs 23 lakh | Rs 19.31 lakh (average)

Check the data science roles available at [24]7.ai here

**Salaries as per Glassdoor reviews. 

What Do Deloitte & Accenture Bumper Results Speak Of The IT Outsourcing Market https://analyticsindiamag.com/ai-origins-evolution/deloitte-accenture-it-outsourcing-market/ Mon, 11 Oct 2021 08:30:00 +0000 https://analyticsindiamag.com/?p=10051164


Ireland-based Accenture and UK-based Deloitte, two big names in IT services, have each reported revenue exceeding $50 billion for FY21. Accenture announced revenue of $50.5 billion, an increase of 14 per cent in US dollars over fiscal year 2020. Similarly, Deloitte reached $50.2 billion in revenue, an increase of 5.5 per cent. 

The tremendous revenue figures reported by both Accenture and Deloitte indicate how well the consulting and outsourcing business is thriving despite the COVID-19 crisis. Accenture's outsourcing revenue alone was $23.2 billion, more than 45 per cent of its total revenue. 

The State Of India’s Outsourcing Market

India’s global sourcing market continues to grow faster than its IT-BPM business. India is the world’s most popular sourcing destination, with a market share of around 55 per cent of the US$ 200-250 billion global services sourcing industry in 2019-20. So, is it an exaggeration to claim that the demand environment for Indian IT companies – which accounted for 8 per cent of India’s GDP last year – looks strong and promising? 

India’s IT sector is expected to reach a market size of $100 billion by 2025. Moreover, COVID-19 compelled businesses worldwide to adapt to the “Work from Home” setup with minimal interruption to their business models. This prompted enterprises to switch from a captive to an outsourced model, which is projected to help Indian IT firms. As a result, Indian IT majors, including TCS, Wipro and Infosys, are on a hiring spree, with over 160 per cent year-on-year growth in hiring for June 2021.

Microsoft, Cisco, American Express, NatWest group, and Google have already outsourced their IT services to India simply because the country is ready with the required resources to meet the software demands of the world. Also, the government actively promotes business efforts to lure international investors and popularise IoT ecosystems and IT hardware development, which gives India an edge as an outsourcing destination. In addition to the reasons mentioned above, let’s understand the latest trends that are going to shape the outsourcing market in the coming years:

  • Businesses are willing to increase their outsourcing budgets: Companies seeking IT services increasingly prioritise quality over quantity, and enterprises in the IT sector have indicated that they will increase budgets over time, as per the latest report from NASSCOM and McKinsey, “Future of Technology Services – Navigating the New Normal.”
  • Talent shortage: With the recent digital transformation wave, new technologies have entered the market in quick succession. The latest advances in AI, ML, data science and cybersecurity have forced companies to look for skilled talent; facing a shortage of such expertise at home, many are outsourcing their IT services.
  • COVID-19 has fuelled digital spending: Beyond fierce competition and fast-moving technology, the pandemic has forced companies to digitalise their operations. Microsoft CEO Satya Nadella, for instance, remarked that two years’ worth of digital transformation had happened in just two months.
  • Companies are moving towards the agile model: Businesses in 2021 and beyond will opt for an agile development approach. With low overhead costs, timely delivery, better communication and a good talent pool, outsourcing remains a lucrative long-term approach for enterprises.

Conclusion

Since the early 2000s, the Indian information and communications technology (ICT) industry has maintained consistent annual growth rates. IT-enabled Services (ITeS), IT hardware manufacturing and Software-as-a-Service (SaaS) distribution are all branches of the domestic ICT business. Even though most people equate global IT hardware production with China, India is a significant player in the Asia-Pacific region. Prominent international companies, including IBM, Dell and Hewlett Packard, chose India as their R&D location years ago. 

The pandemic has not only disrupted the IT outsourcing business but also transformed how people think about outsourcing and how they hire for it. As most firms scale and accelerate their digital transformation projects, Indian IT companies are winning new business, and the Indian IT outsourcing industry is expected to keep growing in revenue.

Prashanth Kaddi https://analyticsindiamag.com/intellectual-ai-discussions/prashanth-kaddi/ Wed, 22 Sep 2021 05:38:07 +0000 https://analyticsindiamag.com/?p=10049198


Prashanth Kaddi (aka Kaddi) is an analytics thought leader known as an evangelist not just for the industrialisation of analytics solutions but for ensuring organisations extract business value from data science initiatives. Over a career spanning more than two decades, he has worked across consulting, banking and co-founding a startup, with each role involving building new businesses and teams from the ground up.

As a Partner at Deloitte, Kaddi has led the setting up and growth of the data science practice as well as the broader data analytics growth journey. In this role, he has driven business growth, built new capabilities and delivered on the analytics practice’s growth ambitions. He has also led highly successful outcomes for clients across sectors, both in India and globally; as a result, enterprises have transformed business models, grown top and bottom lines, deepened customer engagement and managed risk better through AI/ML solutions delivered to their businesses. He is also part of the global leadership team of the Deloitte AI Institute, driving AI as a strategic growth engine.

Kaddi is a regularly featured author and speaker on analytics topics and at industry events. He is an alumnus of NIT Surathkal and IIM Ahmedabad.


Deloitte CEO Commits To Get 20,000 Oxygen Concentrators To India In The Next Few Weeks https://analyticsindiamag.com/ai-news-updates/deloitte-ceo-commits-to-get-20000-oxygen-concentrators-to-india-in-the-next-few-weeks/ Tue, 27 Apr 2021 12:44:24 +0000 https://analyticsindiamag.com/?p=10039067


Punit Renjen, the global CEO of Deloitte, announced in a LinkedIn post that more than forty CEOs of multinational companies have come together as a coalition to help India fight the second wave. 

Renjen stated, “The images from my homeland have pained us all. My thoughts are for my mother and family in Haryana and my professional family of over 50,000 Deloitte India colleagues, many of whom have been impacted by the pandemic’s frightening spiral. The way to fight it is to respond together against a virus that doesn’t discriminate against anyone. This is a global crisis and we know that nobody is safe till everyone is safe.”

“I’ve spent the weekend working with the US-India Strategic Partnership Forum, the U.S.-India Business Council, the Business Roundtable and India’s Ambassador to the US to mobilize resources. More than forty CEOs of multinational companies came together this weekend to focus on immediate needs like oxygen concentrators, oxygen cylinders and generators, home monitoring kits and critical medicines,” Renjen added.

According to sources, “It is a collective initiative of the U.S.-India Business Council of the U.S. Chamber of Commerce, and the U.S.-India Strategic and Partnership Forum and Business Roundtable.”

The initiative will focus on immediate needs like oxygen concentrators, oxygen cylinders and generators, home monitoring kits and critical medicines for India. The CEO mentioned that he has already arranged about 1,000 oxygen concentrators and an additional 11,000 are being sourced from fellow CEO colleagues.

Renjen stated, “We are all very encouraged by the commitment of the US government and I and my CEO colleagues will spend the coming days advocating for further support from both government and the private sector. I hope that the 1,000 oxygen concentrators provided by Deloitte today and an additional 11,000 being sourced by the end of this week from my fellow CEO colleagues, will help the wider international effort to assist the people of India. India will prevail.”

Deloitte Deploys NVIDIA’s DGX A100s For Its New AI Computing Centre https://analyticsindiamag.com/ai-news-updates/deloitte-deploys-nvidias-dgx-a100s-for-its-new-ai-computing-centre/ Thu, 04 Mar 2021 06:54:53 +0000 https://analyticsindiamag.com/?p=10021451


Deloitte has announced the launch of a first-of-its-kind centre for AI computing, outfitted with NVIDIA’s DGX A100 systems to create a supercomputing architecture. 

Designed to accelerate AI development for its clients, this new Deloitte Centre for AI Computing will help clients explore various AI strategies to become AI-fuelled organisations. The centre aims to continue the growth of artificial intelligence throughout enterprise IT. 

The New York-based consulting giant stated in its official statement that the new centre’s accelerated computing platforms would feature NVIDIA graphics processing units, along with its Mellanox networking and NVIDIA software capabilities to transform data processing, analytics and AI.

A recent survey by Deloitte noted that more than half of respondents reported spending more than $20 million over the past year on artificial intelligence technology and talent. Thus, the Deloitte Center for AI Computing has been designed to deliver an accelerated platform for expediting the development of new AI applications.

Jason Girzadas, managing principal at Deloitte Consulting, stated in the official statement that the Deloitte Center for AI Computing would bring together Deloitte’s deep AI experience and the powerful supercomputing capabilities of NVIDIA DGX A100 systems to accelerate clients’ journey beyond AI experimentation. 

He said, “Our collaboration with NVIDIA can enable clients to quickly deliver on the full promise of AI solutions to transform both their businesses and the basis of professional services.”

Deloitte stated to the media that the centre significantly expands its ability to develop AI systems in the AI Exploration Lab in Austin, Texas, and the AI Factory in Canada.

Adding to this, NVIDIA CEO Jensen Huang mentioned that the company is moving from research labs into the industry, and the partnership with Deloitte will supercharge its reach.

He said that while every industry and its products and services are being transformed by AI, companies will become learning machines by supporting their people with AI. “Together with Deloitte’s global force of experienced specialists, we will turbocharge the realisation of this vision.”

Deloitte said that the DGX A100 systems would be used to develop AI applications for industries like technology, media and telecommunications, government and public services, life sciences and health care, auto and transportation, financial services, and the energy sector.

The firm said the new Deloitte Centre for AI Computing would provide these organisations with a platform, experience and computing resources to speed the development of a wide range of AI applications, from autonomous vehicles to digital contact centres to public sector innovation.

Additionally, the centre will be collaborating with the Deloitte AI Institute to support the positive growth and development of AI through conversations and innovative research. The centre will also focus on building ecosystem relationships that help advance human-machine collaboration.

Deloitte Launches The Deloitte AI Institute – A Centre To Advance The Development Of AI For Enterprises https://analyticsindiamag.com/ai-news-updates/deloitte-launches-the-deloitte-ai-institute-a-centre-to-advance-the-development-of-ai-for-enterprises/ Fri, 26 Jun 2020 13:03:15 +0000 https://analyticsindiamag.com/?p=68390


Recently, Deloitte announced the launch of the Deloitte AI Institute – a centre that focuses on artificial intelligence (AI) research, eminence and applied innovation across industries. The institute is said to apply cutting-edge research to help address a wide spectrum of relevant AI use cases and bridge the ethics gap surrounding AI. 

In a blog post, Nitin Mittal, AI co-leader and principal, Deloitte Consulting LLP said, “The Deloitte AI Institute is being established to advance the conversation and development of AI for enterprises.” 

He added, “Our goal is to blend Deloitte’s deep experience in applied AI with a robust network of some of the most intelligent AI minds in the world to challenge the status quo. Through the power of this centre, we aim to deliver impactful and game-changing research; and innovation to help our clients lead in the ‘Age of With,’ a world where humans work side-by-side with machines.”

The network of this institute will include the top industry thought leaders and academic luminaries, start-ups, research and development groups, entrepreneurs, investors as well as innovators.

“With our unique experience, investments in AI and work with top organisations, we believe the Deloitte AI Institute can ignite ground-breaking applied AI solutions for enterprises,” said Beena Ammanath, executive director of Deloitte AI Institute, Deloitte Consulting LLP. “Further, to help enterprises advance with AI, we will aim to help organisations remain distinctively human in a technology-driven world.”

Irfan Saif, Deloitte Risk & Financial Advisory principal, Deloitte & Touche LLP and Deloitte AI co-leader stated that with AI ethics, the AI institute aims to help organisations achieve a positive future by bringing together top stakeholders from all sectors of society to discuss and co-design effective policies and frameworks, such as Deloitte’s Trustworthy AI framework, for governing AI.

Why The Big Four Audit Firms PwC, EY, Deloitte & KPMG Are Investing Heavily In AI https://analyticsindiamag.com/ai-origins-evolution/why-the-big-four-audit-firms-pwc-ey-deloitte-kpmg-are-investing-heavily-in-artificial-intelligenc/ Tue, 07 Jan 2020 13:30:00 +0000 https://analyticsindiamag.com/?p=53328


The Big Four audit firms, through a record-level investment of billions into AI technology, are drastically changing the way they have traditionally operated.

Tax preparation, auditing and business consultation are services that have traditionally depended predominantly on human capital. But artificial intelligence (AI) has now disrupted these business models and fundamentally changed the nature of accounting. For example, when tax authorities issue a new regulation, audit firms have had to manually re-examine thousands of documents so that clients comply with the new law. Using NLP to extract the relevant information, with a human in the loop to validate the results, makes the process more consistent and efficient. 
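The human-in-the-loop workflow described above can be sketched as a simple triage loop: automated extraction flags regulation-related clauses, and low-confidence hits are queued for a human reviewer. The documents, pattern and confidence rule below are purely illustrative assumptions, not any firm's actual system.

```python
import re

# Illustrative documents and a toy pattern for regulation-related language.
documents = [
    "The client shall comply with Section 12 of the new tax regulation.",
    "Ensure compliance checks are completed before closing.",
    "Payment terms remain net 30 days.",
]
pattern = re.compile(r"regulation|compliance|Section \d+", re.IGNORECASE)

auto_flagged, review_queue = [], []
for doc in documents:
    hits = pattern.findall(doc)
    if len(hits) >= 2:    # several matches: flag automatically
        auto_flagged.append(doc)
    elif hits:            # a single match: route to a human reviewer
        review_queue.append(doc)

print(len(auto_flagged), len(review_queue))  # 1 1
```

Real systems use trained NLP models rather than regular expressions, but the division of labour is the same: the machine does the broad first pass, and humans validate the uncertain cases.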

This is why the Big Four are investing at record levels. Three of the four largest accounting firms have announced a combined $9 billion in spending on artificial intelligence (AI) and data analytics capabilities, according to Bloomberg. The firms are also focused on training employees to bring advanced digital solutions into all consulting and audit practices. 

The technology transformation at big accounting and consultancy firms goes far beyond automating standard auditing and accounting processes. With an emphasis on artificial intelligence, data analytics and large-scale tech training, the most prominent organisations in accounting are making technology part of their permanent identity, which they see as key to survival in the age of AI.

A Look At Each Big Four Audit Firm’s AI Investments

In December 2019, KPMG announced that it would allocate $5 billion over five years to advanced technologies like AI. According to KPMG, the announcement builds on a 2015 strategic decision to make automation and artificial intelligence central to its future. KPMG’s spending is focused on building new cloud-based technology, creating innovative solutions for clients, either in-house or by collaborating with other firms, and training its employees to leverage new technologies like AI and automation. The firm has also been focusing on AI and data analytics that can be used both in audit processes and in offerings sold to consulting clients. KPMG’s $5 billion outlay is the largest investment in cutting-edge technology among the Big Four accounting firms. 

PwC, on the other hand, announced in September a program to spend $3 billion over the next four years, mainly on training its workforce to exploit new technology. To address the skills gap, PwC launched a digital fitness application around two years ago to measure staff’s technology skills. The firm also runs what it calls a digital accelerator program, in which employees go deeper into a particular tool or technology and then explore possible uses to take back to their teams and clients. 

Further, Ernst & Young (EY) announced a two-year, $1 billion investment a year earlier, in 2018. As part of that plan to spend $1 billion on advanced tech capabilities over two years, EY has been transforming traditional client solutions and rolling out innovative services. For example, in artificial intelligence (AI) and natural language processing, EY Document Intelligence helps EY teams evaluate client documents and contracts more efficiently than human auditors alone. In 2019, the solution was moved to the Azure cloud platform and successfully tested with EY assurance teams worldwide on lease accounting change and audit engagements, cutting processing time by up to 90% and increasing accuracy by up to 25%. 

While Deloitte has not announced a precise figure for its investment in tech capabilities, it is working hard to become one of the biggest providers of automation solutions to law firms. Deloitte has deployed Kira, an ML-based artificial intelligence contract analysis tool. The partnership with Kira Systems helps Deloitte detect what is crucial to reviewers in contracts and then identify essential information across a massive range of agreements. Deloitte said it had 3,000 active users of Kira and has trained the platform to discover thousands of different data insights for clients. 

Rising Competition From Tech Companies 

Of course, one of the significant reasons why big audit firms have taken AI investments so seriously is because they see a threat from big tech companies — which are riding on the power of open-source innovation. Another factor is that tech companies are already assisting firms in many parts of the finance and banking functions.

A survey of 150 big companies in the U.S. and U.K. by a research firm found that about one-third were delegating audit processes to save money, and another 44% were considering it. About 45% said they would delegate auditing to technology firms. And it is not just large tech companies that may compete with audit firms: even smaller independent software developers are selling AI-based software that assists firms with auditing and accounting services. 

Overview

Although the big professional services and accounting firms haven’t become tech companies, technology remains core to their future. Developing analytics and advanced AI/ML models to derive valuable insights will benefit audit firms more than traditional processing alone. With analytics, firms could evaluate a client’s entire set of transactions in real time, giving them the ability to spot trends and anomalies.
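As a toy illustration of the anomaly spotting described above, a simple z-score check flags any transaction far from the mean. The amounts and the two-standard-deviation threshold are made up for the example; production systems use far more sophisticated models.

```python
from statistics import mean, stdev

# Made-up transaction amounts with one obvious outlier.
amounts = [120.0, 95.5, 110.2, 101.3, 99.8, 5_000.0, 103.7, 98.1]

# Flag transactions more than two standard deviations from the mean.
mu, sigma = mean(amounts), stdev(amounts)
anomalies = [a for a in amounts if abs(a - mu) / sigma > 2]
print(anomalies)  # [5000.0]
```

Even this crude check shows why evaluating the full transaction set beats sampling: the outlier surfaces immediately instead of depending on whether it lands in the sample.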

Deloitte Launches New Public Data Visualization Tool Open Source Compass https://analyticsindiamag.com/ai-news-updates/deloitte-launches-new-public-data-visualization-tool-open-source-compass/ Tue, 24 Sep 2019 05:54:38 +0000 https://analyticsindiamag.com/?p=46294


Deloitte this month announced the launch of a first-of-its-kind public data visualization tool, Open Source Compass (OSC), intended to help C-suite leaders, product managers and software engineers understand the trajectory of open source development and emerging technologies.

The information curated and shared through the new data visualization tool will provide insights to inform key business decisions. OSC evaluates code commits and developer mindshare to better understand technology trends, guides exploration into relevant platforms and languages, and enables visibility into the developer talent landscape.

Designed and developed by Deloitte, Datawheel, and Artificial and Natural Intelligence Toulouse Institute (ANITI) Chair Cesar Hidalgo, OSC analyzes data from the largest open source development platform, which brings together over 36 million developers from around the world. OSC visualizes the scale and reach of emerging technology domains, spanning over 100 million repositories and projects, in areas including blockchain, machine learning and IoT.

Open source software continues to gain traction in the enterprise as a powerful accelerator for digital transformation, value creation and talent strategy, complementing increasingly strategic cloud providers, enterprise technology partners, and plans to modernize legacy core technology stacks. Open source also continues to provide an invaluable foundation for startups with more limited resources. Some of the key benefits of Deloitte’s new open-source analysis tool include:

  • Exploring which specific open source projects are growing or stagnating in domains like machine learning.
  • Identifying potential platforms for prototyping, experimentation, and scaled innovation.
  • Scouting for tech talent in specific technology domains and locations.
  • Detecting and assessing technology risks.
  • Understanding what programming languages are gaining or losing ground to inform training and recruitment.
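The language-trend comparison the last bullet describes can be sketched in a few lines. The records, quarters and figures below are invented for illustration and say nothing about OSC’s actual data model or methodology; the function name is hypothetical.

```python
from collections import defaultdict

def quarterly_growth(commits):
    """commits: (language, quarter, count) tuples. Returns each
    language's relative growth from its earliest to its latest
    quarter, e.g. 0.25 for a 25% rise in commit activity."""
    by_lang = defaultdict(dict)
    for lang, quarter, count in commits:
        by_lang[lang][quarter] = by_lang[lang].get(quarter, 0) + count
    return {
        lang: (series[max(series)] - series[min(series)]) / series[min(series)]
        for lang, series in by_lang.items()
    }

sample = [
    ("Python", "2019Q1", 1200), ("Python", "2019Q2", 1500),
    ("Ruby",   "2019Q1", 800),  ("Ruby",   "2019Q2", 720),
]
print(quarterly_growth(sample))  # -> {'Python': 0.25, 'Ruby': -0.1}
```

A tool like OSC would compute this kind of aggregate over real commit streams; the point here is only that “gaining or losing ground” reduces to a simple per-language growth rate over time.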

“Open source software has been around since the early days of the internet and has incited a completely new kind of collaboration and productivity — especially in the realm of emerging technology,” said Bill Briggs, chief technology officer, Deloitte Consulting LLP. “Deloitte’s Open Source Compass can help provide insights that allow organizations to be more deliberate in their approach to innovation, while connecting to a pool of burgeoning talent.”

Open Source Compass will provide insights into 15 emergent technology domains, including cybersecurity, virtual/augmented reality, serverless computing and machine learning, to name a few. The site will offer a view into systemic trends on how the domains are evolving. The open source platform will also explore geographic trends based on project development, authors and knowledge sharing across cities and countries. It will also track how certain programming languages are being used and how fast they are growing. Free and open to the public, the site enables users to query technology domains of interest, run their own comparisons, and share or download data.

Datawheel, an award-winning company specialized in data visualization solutions, designed and developed the platform. “Making sense of large streams of data is one of the most pressing challenges of our day,” said Hidalgo. “In Open Source Compass, we used our latest technologies to create a platform that turns opaque and difficult to understand streams of data into simple and easy to understand visualizations.”

“Open Source Compass can address different organizational needs for different types of users based on their priorities,” said Ragu Gurumurthy, global chief innovation officer for Deloitte Consulting LLP. “A CTO could explore the latest project developments in machine learning to help drive experimentation, while a learning and development leader can find the most popular programming language for robotics that could then be taught as a new skill in an internal course offering.”

The post Deloitte Launches New Public Data Visualization Tool Open Source Compass appeared first on AIM.

]]>
Online Education Vs Traditional Degrees: Students Armed With AI/ML Skills Have An Edge Over The Others https://analyticsindiamag.com/ai-origins-evolution/online-education-vs-traditional-university-degrees-ai-ml/ Thu, 08 Nov 2018 05:18:39 +0000 https://analyticsindiamag.com/?p=29996

In the last few years, India has witnessed rapid changes in the educational technology landscape. The spurt of jobs in emerging technologies — artificial intelligence and machine learning has spurred the growth of EdTech companies which are at the forefront of providing cutting-edge skills to young college professionals and undergraduates keen to upskill and re-learn. […]

The post Online Education Vs Traditional Degrees: Students Armed With AI/ML Skills Have An Edge Over The Others appeared first on AIM.

]]>

In the last few years, India has witnessed rapid changes in the educational technology landscape. The spurt of jobs in emerging technologies such as artificial intelligence and machine learning has spurred the growth of EdTech companies, which are at the forefront of providing cutting-edge skills to young professionals and undergraduates keen to upskill and re-learn. In addition, India is implementing a strategic approach to skills development aimed at the digital era, with an increased emphasis on strengthening core competencies in 21st-century skills, digital skills and robotics.

So, where does this leave standard university education, which is increasingly being challenged by the rise of MOOCs from EdTech platforms? Besides, enterprises have taken it upon themselves to bridge the skills gap with industry-created programmes that help programmers and graduates level up in line with current industry demands.

In the face of this ever-changing digital transformation in India, is university education playing catch-up, and is it churning out job-ready students who can compete on the global stage? Currently, India needs to implement a skills development plan across a wide range of in-demand topics such as artificial intelligence, the Internet of Things, big data, robotics, material sciences, semiconductors, smart cities and societies, digital competencies, open source learning and intellectual property rights.

How Indian Universities Fail In Providing In-demand 21st Century Skills

In this article, we list down how traditional university education differs from online schools that focus on imparting the digital skills of the future.

Outdated Curriculum

It is not just that the curricula aren’t keeping up with the skills required for students to be job-ready; many institutes in India still depend on old-school, theory-based syllabi. This is one of the key reasons why freshers are a poor match for the market, given that certain job types and industries will soon be made redundant by automation.

Budding Technologies

We are at a high point in technology, modelling, building and inventing things that would leave past generations spellbound. Technologies such as data analytics, machine learning, user experience design and artificial intelligence, which were in the realm of sci-fi a few decades ago, have become a part of our lives. But they have not yet become a part of our education system.

Lack of Mentorship

In an earlier article, we spoke about the need for mentorship and its role in shaping the life and career of a new data scientist or a budding AI expert. Traditional university education, in contrast, focuses on merit, grades and bookish knowledge. More modern methods of education focus on reasoning, questioning and analytical skills, which are much needed in the current scenario.

Transform, Don’t Reform

When there is change, there is only a slim chance that everyone emerges a winner. Many jobs in the IT sector are becoming obsolete, and so are numerous technologies. On the brighter side, scores of career options are being created in fields such as big data, machine learning and artificial intelligence, and the demand for people with knowledge of these technologies is only increasing.

Students Lean Towards Byte-Sized Learning Models And Upskilling

For most learners, it is now possible to download smartphone apps that turn complicated subjects into games, squeeze bite-sized lessons on everything from rhymes to coding into ten-minute talks, or even quantify various non-curriculum activities as work-related training. Platforms such as Coursera, Udemy, Lynda, Alison, Brightstorm, Howcast, Codecademy, TED, Big Think, Open Education Consortium and edX are among the leading destinations for online learning.

These websites and programmes offer a wide range of subjects, technologies and career options for people who have the zeal to learn through a digital medium. While they are effective tools for acquiring specific skills, there is another pressing and universal issue: the future is digital, and anyone whose skills are outmoded will be left behind. In a survey conducted by Deloitte US, 75 percent of school-going children said they wanted to learn outside the classroom, preferring smartphones, tablets and personal computers to traditional methods.

Enterprises Forge Partnerships With Edtech Companies

Large global organisations are already struggling with a shortage of employees with the right skills to keep up with the competition. At the same time, digitisation is affecting the traditional teaching system in ways so rapid and profound that it could be described as an extinction event. Daphne Koller, one of the founders of Coursera, stressed this point in one of her TED Talks, highlighting how Coursera gave millions around the world a platform to learn technologies from home, bringing great success to their lives and transforming their careers. These drastic changes in the educational system are hollowing out the middle class and creating needs that cannot be filled by our current systems of learning; we must work together to address them at a systemic level.

For the last three decades, India has been a land of engineers, with budding technical colleges and demand from the services sector. In total, India has around 6,300 technical institutions approved by the AICTE. But the question most employers still face is: are these engineers prepared for future challenges?

MOOCs Allow The Ease Of Customising Education

The biggest advantage of online learning is that it gives us the choice to pick courses that interest us. Rather than being confined to only one or two, learners can exercise their own judgement and decision-making in shaping what they study.

The post Online Education Vs Traditional Degrees: Students Armed With AI/ML Skills Have An Edge Over The Others appeared first on AIM.

]]>