149 Notebook courses

DP-100T01 Designing and Implementing a Data Science Solution on Azure

By Nexus Human

Duration: 4 days (24 CPD hours)

This course is intended for
Data scientists with existing knowledge of Python and machine learning frameworks such as Scikit-Learn, PyTorch, and TensorFlow who want to build and operate machine learning solutions in the cloud.

Overview
Learn how to operate machine learning solutions at cloud scale using Azure Machine Learning. This course teaches you to leverage your existing knowledge of Python and machine learning to manage data ingestion and preparation, model training and deployment, and machine learning solution monitoring with Azure Machine Learning and MLflow.

Prerequisites
* Creating cloud resources in Microsoft Azure
* Using Python to explore and visualize data
* Training and validating machine learning models using common frameworks such as Scikit-Learn, PyTorch, and TensorFlow
* Working with containers
AI-900T00: Microsoft Azure AI Fundamentals is recommended, or the equivalent experience.

Course outline
1 - DESIGN A DATA INGESTION STRATEGY FOR MACHINE LEARNING PROJECTS * Identify your data source and format * Choose how to serve data to machine learning workflows * Design a data ingestion solution
2 - DESIGN A MACHINE LEARNING MODEL TRAINING SOLUTION * Identify machine learning tasks * Choose a service to train a machine learning model * Decide between compute options
3 - DESIGN A MODEL DEPLOYMENT SOLUTION * Understand how the model will be consumed * Decide on real-time or batch deployment
4 - DESIGN A MACHINE LEARNING OPERATIONS SOLUTION * Explore an MLOps architecture * Design for monitoring * Design for retraining
5 - EXPLORE AZURE MACHINE LEARNING WORKSPACE RESOURCES AND ASSETS * Create an Azure Machine Learning workspace * Identify Azure Machine Learning resources * Identify Azure Machine Learning assets * Train models in the workspace
6 - EXPLORE DEVELOPER TOOLS FOR WORKSPACE INTERACTION * Explore the studio * Explore the Python SDK * Explore the CLI
7 - MAKE DATA AVAILABLE IN AZURE MACHINE LEARNING * Understand URIs * Create a datastore * Create a data asset
8 - WORK WITH COMPUTE TARGETS IN AZURE MACHINE LEARNING * Choose the appropriate compute target * Create and use a compute instance * Create and use a compute cluster
9 - WORK WITH ENVIRONMENTS IN AZURE MACHINE LEARNING * Understand environments * Explore and use curated environments * Create and use custom environments
10 - FIND THE BEST CLASSIFICATION MODEL WITH AUTOMATED MACHINE LEARNING * Preprocess data and configure featurization * Run an Automated Machine Learning experiment * Evaluate and compare models
11 - TRACK MODEL TRAINING IN JUPYTER NOTEBOOKS WITH MLFLOW * Configure MLflow for model tracking in notebooks * Train and track models in notebooks
12 - RUN A TRAINING SCRIPT AS A COMMAND JOB IN AZURE MACHINE LEARNING * Convert a notebook to a script * Run a script as a command job * Use parameters in a command job (a code sketch follows this outline)
13 - TRACK MODEL TRAINING WITH MLFLOW IN JOBS * Track metrics with MLflow * View metrics and evaluate models
14 - PERFORM HYPERPARAMETER TUNING WITH AZURE MACHINE LEARNING * Define a search space * Configure a sampling method * Configure early termination * Use a sweep job for hyperparameter tuning
15 - RUN PIPELINES IN AZURE MACHINE LEARNING * Create components * Create a pipeline * Run a pipeline job
16 - REGISTER AN MLFLOW MODEL IN AZURE MACHINE LEARNING * Log models with MLflow * Understand the MLflow model format * Register an MLflow model
17 - CREATE AND EXPLORE THE RESPONSIBLE AI DASHBOARD FOR A MODEL IN AZURE MACHINE LEARNING * Understand Responsible AI * Create the Responsible AI dashboard * Evaluate the Responsible AI dashboard
18 - DEPLOY A MODEL TO A MANAGED ONLINE ENDPOINT * Explore managed online endpoints * Deploy your MLflow model to a managed online endpoint * Deploy a model to a managed online endpoint * Test managed online endpoints
19 - DEPLOY A MODEL TO A BATCH ENDPOINT * Understand and create batch endpoints * Deploy your MLflow model to a batch endpoint * Deploy a custom model to a batch endpoint * Invoke and troubleshoot batch endpoints
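To give a flavour of the command-job material in modules 12-13, here is a minimal sketch (not taken from the official courseware) of submitting a training script as a command job with the Azure ML Python SDK v2. The workspace identifiers, the "cpu-cluster" compute name, the curated environment name, and the local train.py script are all illustrative assumptions.

```python
# Minimal sketch: submit a training script as an Azure ML command job (azure-ai-ml SDK v2).
# Assumes an existing workspace, a compute cluster named "cpu-cluster", and a local
# ./src/train.py that logs metrics with MLflow (e.g. mlflow.log_metric).
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",      # placeholder
    resource_group_name="<resource-group>",   # placeholder
    workspace_name="<workspace-name>",        # placeholder
)

job = command(
    code="./src",                                                   # folder containing train.py
    command="python train.py --reg_rate ${{inputs.reg_rate}}",      # parameterised script call
    inputs={"reg_rate": 0.01},                                      # example hyperparameter
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",  # curated env; name may vary
    compute="cpu-cluster",
    experiment_name="train-classifier",
    display_name="sklearn-train-job",
)

returned_job = ml_client.create_or_update(job)  # submit the job to the workspace
print(returned_job.studio_url)                  # link for monitoring the run in the studio
```

The same job definition can later be wrapped in a sweep for hyperparameter tuning (module 14) or used as a pipeline component (module 15).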

DP-100T01 Designing and Implementing a Data Science Solution on Azure
Delivered Online, 5 days, May 28th, 13:00 + 2 more
£1785

Create a sustainable Sketchbook/Notebook from recycled Packing Paper

By Nina Vangerow

Join us for a fun and eco-friendly event where you can learn to create a sustainable Sketchbook/Notebook from recycled Packing Paper.

Create a sustainable Sketchbook/Notebook from recycled Packing Paper
Delivered In-Person
Dates arranged on request
£25

SC-200T00 Microsoft Security Operations Analyst

By Nexus Human

Duration: 4 days (24 CPD hours)

This course is intended for
The Microsoft Security Operations Analyst collaborates with organizational stakeholders to secure information technology systems for the organization. Their goal is to reduce organizational risk by rapidly remediating active attacks in the environment, advising on improvements to threat protection practices, and referring violations of organizational policies to appropriate stakeholders. Responsibilities include threat management, monitoring, and response by using a variety of security solutions across their environment. The role primarily investigates, responds to, and hunts for threats using Microsoft Sentinel, Microsoft Defender for Cloud, Microsoft 365 Defender, and third-party security products. Since the Security Operations Analyst consumes the operational output of these tools, they are also a critical stakeholder in the configuration and deployment of these technologies.

Overview
Learn how to investigate, respond to, and hunt for threats using Microsoft Sentinel, Microsoft Defender for Cloud, and Microsoft 365 Defender. In this course you will learn how to mitigate cyberthreats using these technologies. Specifically, you will configure and use Microsoft Sentinel as well as utilize Kusto Query Language (KQL) to perform detection, analysis, and reporting. The course was designed for people who work in a Security Operations job role and helps learners prepare for the exam SC-200: Microsoft Security Operations Analyst.

Prerequisites
* Basic understanding of Microsoft 365
* Fundamental understanding of Microsoft security, compliance, and identity products
* Intermediate understanding of Windows 10
* Familiarity with Azure services, specifically Azure SQL Database and Azure Storage
* Familiarity with Azure virtual machines and virtual networking
* Basic understanding of scripting concepts

Course outline
1 - INTRODUCTION TO MICROSOFT 365 THREAT PROTECTION * Explore Extended Detection & Response (XDR) response use cases * Understand Microsoft Defender XDR in a Security Operations Center (SOC) * Explore Microsoft Security Graph * Investigate security incidents in Microsoft Defender XDR
2 - MITIGATE INCIDENTS USING MICROSOFT 365 DEFENDER * Use the Microsoft Defender portal * Manage incidents * Investigate incidents * Manage and investigate alerts * Manage automated investigations * Use the action center * Explore advanced hunting * Investigate Microsoft Entra sign-in logs * Understand Microsoft Secure Score * Analyze threat analytics * Analyze reports * Configure the Microsoft Defender portal
3 - PROTECT YOUR IDENTITIES WITH MICROSOFT ENTRA ID PROTECTION * Microsoft Entra ID Protection overview * Detect risks with Microsoft Entra ID Protection policies * Investigate and remediate risks detected by Microsoft Entra ID Protection
4 - REMEDIATE RISKS WITH MICROSOFT DEFENDER FOR OFFICE 365 * Automate, investigate, and remediate * Configure, protect, and detect * Simulate attacks
5 - SAFEGUARD YOUR ENVIRONMENT WITH MICROSOFT DEFENDER FOR IDENTITY * Configure Microsoft Defender for Identity sensors * Review compromised accounts or data * Integrate with other Microsoft tools
6 - SECURE YOUR CLOUD APPS AND SERVICES WITH MICROSOFT DEFENDER FOR CLOUD APPS * Understand the Defender for Cloud Apps Framework * Explore your cloud apps with Cloud Discovery * Protect your data and apps with Conditional Access App Control * Walk through discovery and access control with Microsoft Defender for Cloud Apps * Classify and protect sensitive information * Detect threats
7 - RESPOND TO DATA LOSS PREVENTION ALERTS USING MICROSOFT 365 * Describe data loss prevention alerts * Investigate data loss prevention alerts in Microsoft Purview * Investigate data loss prevention alerts in Microsoft Defender for Cloud Apps
8 - MANAGE INSIDER RISK IN MICROSOFT PURVIEW * Insider risk management overview * Create and manage insider risk policies * Investigate insider risk alerts * Take action on insider risk alerts through cases * Manage insider risk management forensic evidence * Create insider risk management notice templates
9 - INVESTIGATE THREATS BY USING AUDIT FEATURES IN MICROSOFT DEFENDER XDR AND MICROSOFT PURVIEW STANDARD * Explore Microsoft Purview Audit solutions * Implement Microsoft Purview Audit (Standard) * Start recording activity in the Unified Audit Log * Search the Unified Audit Log (UAL) * Export, configure, and view audit log records * Use audit log searching to investigate common support issues
10 - INVESTIGATE THREATS USING AUDIT IN MICROSOFT DEFENDER XDR AND MICROSOFT PURVIEW (PREMIUM) * Explore Microsoft Purview Audit (Premium) * Implement Microsoft Purview Audit (Premium) * Manage audit log retention policies * Investigate compromised email accounts using Purview Audit (Premium)
11 - INVESTIGATE THREATS WITH CONTENT SEARCH IN MICROSOFT PURVIEW * Explore Microsoft Purview eDiscovery solutions * Create a content search * View the search results and statistics * Export the search results and search report * Configure search permissions filtering * Search for and delete email messages
12 - PROTECT AGAINST THREATS WITH MICROSOFT DEFENDER FOR ENDPOINT * Practice security administration * Hunt threats within your network
13 - DEPLOY THE MICROSOFT DEFENDER FOR ENDPOINT ENVIRONMENT * Create your environment * Understand operating systems compatibility and features * Onboard devices * Manage access * Create and manage roles for role-based access control * Configure device groups * Configure environment advanced features
14 - IMPLEMENT WINDOWS SECURITY ENHANCEMENTS WITH MICROSOFT DEFENDER FOR ENDPOINT * Understand attack surface reduction * Enable attack surface reduction rules
15 - PERFORM DEVICE INVESTIGATIONS IN MICROSOFT DEFENDER FOR ENDPOINT * Use the device inventory list * Investigate the device * Use behavioral blocking * Detect devices with device discovery
16 - PERFORM ACTIONS ON A DEVICE USING MICROSOFT DEFENDER FOR ENDPOINT * Explain device actions * Run Microsoft Defender antivirus scan on devices * Collect investigation package from devices * Initiate live response session
17 - PERFORM EVIDENCE AND ENTITIES INVESTIGATIONS USING MICROSOFT DEFENDER FOR ENDPOINT * Investigate a file * Investigate a user account * Investigate an IP address * Investigate a domain
18 - CONFIGURE AND MANAGE AUTOMATION USING MICROSOFT DEFENDER FOR ENDPOINT * Configure advanced features * Manage automation upload and folder settings * Configure automated investigation and remediation capabilities * Block at-risk devices
19 - CONFIGURE FOR ALERTS AND DETECTIONS IN MICROSOFT DEFENDER FOR ENDPOINT * Configure advanced features * Configure alert notifications * Manage alert suppression * Manage indicators
20 - UTILIZE VULNERABILITY MANAGEMENT IN MICROSOFT DEFENDER FOR ENDPOINT * Understand vulnerability management * Explore vulnerabilities on your devices * Manage remediation
21 - PLAN FOR CLOUD WORKLOAD PROTECTIONS USING MICROSOFT DEFENDER FOR CLOUD * Explain Microsoft Defender for Cloud * Describe Microsoft Defender for Cloud workload protections * Enable Microsoft Defender for Cloud
22 - CONNECT AZURE ASSETS TO MICROSOFT DEFENDER FOR CLOUD * Explore and manage your resources with asset inventory * Configure auto provisioning * Manual log analytics agent provisioning
23 - CONNECT NON-AZURE RESOURCES TO MICROSOFT DEFENDER FOR CLOUD * Protect non-Azure resources * Connect non-Azure machines * Connect your AWS accounts * Connect your GCP accounts
24 - MANAGE YOUR CLOUD SECURITY POSTURE MANAGEMENT * Explore Secure Score * Explore Recommendations * Measure and enforce regulatory compliance * Understand Workbooks
25 - EXPLAIN CLOUD WORKLOAD PROTECTIONS IN MICROSOFT DEFENDER FOR CLOUD * Understand Microsoft Defender for servers * Understand Microsoft Defender for App Service * Understand Microsoft Defender for Storage * Understand Microsoft Defender for SQL * Understand Microsoft Defender for open-source databases * Understand Microsoft Defender for Key Vault * Understand Microsoft Defender for Resource Manager * Understand Microsoft Defender for DNS * Understand Microsoft Defender for Containers * Understand Microsoft Defender additional protections
26 - REMEDIATE SECURITY ALERTS USING MICROSOFT DEFENDER FOR CLOUD * Understand security alerts * Remediate alerts and automate responses * Suppress alerts from Defender for Cloud * Generate threat intelligence reports * Respond to alerts from Azure resources
27 - CONSTRUCT KQL STATEMENTS FOR MICROSOFT SENTINEL * Understand the Kusto Query Language statement structure * Use the search operator * Use the where operator * Use the let statement * Use the extend operator * Use the order by operator * Use the project operators (see the KQL sketch after this outline)
28 - ANALYZE QUERY RESULTS USING KQL * Use the summarize operator * Use the summarize operator to filter results * Use the summarize operator to prepare data * Use the render operator to create visualizations
29 - BUILD MULTI-TABLE STATEMENTS USING KQL * Use the union operator * Use the join operator
30 - WORK WITH DATA IN MICROSOFT SENTINEL USING KUSTO QUERY LANGUAGE * Extract data from unstructured string fields * Extract data from structured string data * Integrate external data * Create parsers with functions
31 - INTRODUCTION TO MICROSOFT SENTINEL * What is Microsoft Sentinel? * How Microsoft Sentinel works * When to use Microsoft Sentinel
32 - CREATE AND MANAGE MICROSOFT SENTINEL WORKSPACES * Plan for the Microsoft Sentinel workspace * Create a Microsoft Sentinel workspace * Manage workspaces across tenants using Azure Lighthouse * Understand Microsoft Sentinel permissions and roles * Manage Microsoft Sentinel settings * Configure logs
33 - QUERY LOGS IN MICROSOFT SENTINEL * Query logs in the logs page * Understand Microsoft Sentinel tables * Understand common tables * Understand Microsoft Defender XDR tables
34 - USE WATCHLISTS IN MICROSOFT SENTINEL * Plan for watchlists * Create a watchlist * Manage watchlists
35 - UTILIZE THREAT INTELLIGENCE IN MICROSOFT SENTINEL * Define threat intelligence * Manage your threat indicators * View your threat indicators with KQL
36 - CONNECT DATA TO MICROSOFT SENTINEL USING DATA CONNECTORS * Ingest log data with data connectors * Understand data connector providers * View connected hosts
37 - CONNECT MICROSOFT SERVICES TO MICROSOFT SENTINEL * Plan for Microsoft services connectors * Connect the Microsoft Office 365 connector * Connect the Microsoft Entra connector * Connect the Microsoft Entra ID Protection connector * Connect the Azure Activity connector
38 - CONNECT MICROSOFT DEFENDER XDR TO MICROSOFT SENTINEL * Plan for Microsoft Defender XDR connectors * Connect the Microsoft Defender XDR connector * Connect Microsoft Defender for Cloud connector * Connect Microsoft Defender for IoT * Connect Microsoft Defender legacy connectors
39 - CONNECT WINDOWS HOSTS TO MICROSOFT SENTINEL * Plan for Windows hosts security events connector * Connect using the Windows Security Events via AMA Connector * Connect using the Security Events via Legacy Agent Connector * Collect Sysmon event logs
40 - CONNECT COMMON EVENT FORMAT LOGS TO MICROSOFT SENTINEL * Plan for Common Event Format connector * Connect your external solution using the Common Event Format connector
41 - CONNECT SYSLOG DATA SOURCES TO MICROSOFT SENTINEL * Plan for syslog data collection * Collect data from Linux-based sources using syslog * Configure the Data Collection Rule for Syslog Data Sources * Parse syslog data with KQL
42 - CONNECT THREAT INDICATORS TO MICROSOFT SENTINEL * Plan for threat intelligence connectors * Connect the threat intelligence TAXII connector * Connect the threat intelligence platforms connector * View your threat indicators with KQL
43 - THREAT DETECTION WITH MICROSOFT SENTINEL ANALYTICS * What is Microsoft Sentinel Analytics? * Types of analytics rules * Create an analytics rule from templates * Create an analytics rule from wizard * Manage analytics rules
44 - AUTOMATION IN MICROSOFT SENTINEL * Understand automation options * Create automation rules
45 - THREAT RESPONSE WITH MICROSOFT SENTINEL PLAYBOOKS * What are Microsoft Sentinel playbooks? * Trigger a playbook in real-time * Run playbooks on demand
46 - SECURITY INCIDENT MANAGEMENT IN MICROSOFT SENTINEL * Understand incidents * Incident evidence and entities * Incident management
47 - IDENTIFY THREATS WITH BEHAVIORAL ANALYTICS * Understand behavioral analytics * Explore entities * Display entity behavior information * Use Anomaly detection analytical rule templates
48 - DATA NORMALIZATION IN MICROSOFT SENTINEL * Understand data normalization * Use ASIM Parsers * Understand parameterized KQL functions * Create an ASIM Parser * Configure Azure Monitor Data Collection Rules
49 - QUERY, VISUALIZE, AND MONITOR DATA IN MICROSOFT SENTINEL * Monitor and visualize data * Query data using Kusto Query Language * Use default Microsoft Sentinel Workbooks * Create a new Microsoft Sentinel Workbook
50 - MANAGE CONTENT IN MICROSOFT SENTINEL * Use solutions from the content hub * Use repositories for deployment
51 - EXPLAIN THREAT HUNTING CONCEPTS IN MICROSOFT SENTINEL * Understand cybersecurity threat hunts * Develop a hypothesis * Explore MITRE ATT&CK
52 - THREAT HUNTING WITH MICROSOFT SENTINEL * Explore creation and management of threat-hunting queries * Save key findings with bookmarks * Observe threats over time with livestream
53 - USE SEARCH JOBS IN MICROSOFT SENTINEL * Hunt with a Search Job * Restore historical data
54 - HUNT FOR THREATS USING NOTEBOOKS IN MICROSOFT SENTINEL * Access Azure Sentinel data with external tools * Hunt with notebooks * Create a notebook * Explore notebook code
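Modules 27-30 centre on Kusto Query Language. As a taste of how that looks in practice, here is a hedged sketch (not part of the official labs) that runs a KQL query against the Log Analytics workspace behind Microsoft Sentinel using the azure-monitor-query Python library; the workspace ID and the SecurityEvent query are illustrative assumptions.

```python
# Minimal sketch: run a KQL query against the Log Analytics workspace behind Microsoft Sentinel.
# Assumes azure-monitor-query and azure-identity are installed, you are signed in with an
# identity that has Log Analytics Reader, and the workspace ID below is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

# Example KQL: count failed sign-ins (Windows event 4625) per account over the last day
kql = """
SecurityEvent
| where EventID == 4625
| summarize FailedLogons = count() by Account, Computer
| order by FailedLogons desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",  # placeholder
    query=kql,
    timespan=timedelta(days=1),
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(dict(zip(table.columns, row)))  # one dict per result row
```

The same query text could equally be pasted into the Logs page in Microsoft Sentinel or saved as an analytics rule; the library simply automates running it.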

SC-200T00 Microsoft Security Operations Analyst
Delivered Online, 5 days, Jun 4th, 13:00 + 3 more
£2380

'Laurel' The Bespoke Handmade Luxury Leather Notebook / Diary Slim A5

5.0(32)

By Hands of Tym

Laurel is the essence of personalised ease and style. The perfect gift for the budding writer or professional in your life - maybe that’s you. Never miss a meeting or big idea with this refillable leather diary / notebook cover (A5) that fits easily into a bag or briefcase. Laurel features a contrasting adjustable strap with stud fastening to keep your work secure, and two internal card slots perfect for business meetings. Choose your preferred English hand-bound paper insert: plain or lined notebook, or week-to-view diary. There are 9 leather colours to choose from, 2 leather colour straps and gold or silver hardware. Make it your own with bespoke personalisation. If you’re gifting Laurel, a personalised message can be embossed to the bottom right corner (as standard), but get in touch if you have other requests. (This is +£5)

Size and Details
Slim A5 (Quarto Medium) removable leather notebook or diary cover. Paper size: 210mm x 135mm. Thickness: 120 pages. Cover size: 225mm x 145mm x 20mm. 1x insert included. Made to order and available in multiple colours with contrasting strap.

Timeless design
Front strap features the Hands of Tym logo. Made from respectfully-sourced Italian vegetable-tanned leather (biodegradable). Hand cut. Handmade. Hand finished.

Other Sizes Available: Slim A4 (Folio Large), Slim A6 (Octavo Pocket). Refill pad available.

Personalisation & Unique Stamp (not available for Courses or Tools & Supplies)
What is personalisation? Personalisation is an embossed stamp of characters including letters, emojis etc. Please select from the options below if you would like to add to your product.
What is a Unique Stamp? A unique stamp is a custom stamp made up of an image of your choice. It can be handprints, a drawing, handwriting etc. We can discuss this with you once you place your order. (This is +£45)

HOW THIS WAS MADE
Made from respectfully-sourced Italian vegetable-tanned leather, Laurel is free from harsh chemicals. Over time the luxury leather will develop a buttery soft patina. The paper inserts are hand-bound by experienced artisans in England from a quality British paper. Laurel has been thoughtfully designed and handmade by Hands of Tym, in Oxfordshire, England.

Material and Care
Materials: Here at Hands of Tym, we are passionate about sourcing the highest quality sustainable materials. We use responsibly-sourced Italian vegetable-tanned leather. Using only the highest quality cuts, the leather will improve with use and age, developing a soft patina over time. Natural variations are a product feature of this material and contribute to the individual style of the product. The leather is free from harmful chemicals and biodegradable at the end of its life. Hand cut, handmade and hand finished in our workshop in Oxfordshire, England.
Care: Please note there may be natural changes in the colour of the materials and they may stain easily. Please be careful with oils and inks. When not in use, store your product in the soft cloth bag (provided) in a cool and dry place.
Repair: When you buy from Hands of Tym, you’re getting so much more than your new purchase. You’re creating a connection with the maker - Georgie Tym. If your leather product becomes damaged (after all, life happens), contact us to find out about repairing your item.

WHAT'S INCLUDED IN THE PRICE?
Laurel arrives complete with one insert; additional inserts can be purchased directly from us. Laurel arrives wrapped in tissue paper inside a black cotton dust bag, including a 'Made for you' card with care instructions and a unique code to identify your handmade leather notebook cover.

'Laurel' The Bespoke Handmade Luxury Leather Notebook / Diary Slim A5
Delivered Online On Demand
£109

DP-203T00 Data Engineering on Microsoft Azure

By Nexus Human

Duration: 4 days (24 CPD hours)

This course is intended for
The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about data engineering and building analytical solutions using data platform technologies that exist on Microsoft Azure. The secondary audience includes data analysts and data scientists who work with analytical solutions built on Microsoft Azure.

Overview
In this course, the student will learn how to implement and manage data engineering workloads on Microsoft Azure, using Azure services such as Azure Synapse Analytics, Azure Data Lake Storage Gen2, Azure Stream Analytics, Azure Databricks, and others. The course focuses on common data engineering tasks such as orchestrating data transfer and transformation pipelines, working with data files in a data lake, creating and loading relational data warehouses, capturing and aggregating streams of real-time data, and tracking data assets and lineage.

Prerequisites
Successful students start this course with knowledge of cloud computing and core data concepts and professional experience with data solutions.
* AZ-900T00 Microsoft Azure Fundamentals
* DP-900T00 Microsoft Azure Data Fundamentals

Course outline
1 - INTRODUCTION TO DATA ENGINEERING ON AZURE * What is data engineering * Important data engineering concepts * Data engineering in Microsoft Azure
2 - INTRODUCTION TO AZURE DATA LAKE STORAGE GEN2 * Understand Azure Data Lake Storage Gen2 * Enable Azure Data Lake Storage Gen2 in Azure Storage * Compare Azure Data Lake Store to Azure Blob storage * Understand the stages for processing big data * Use Azure Data Lake Storage Gen2 in data analytics workloads
3 - INTRODUCTION TO AZURE SYNAPSE ANALYTICS * What is Azure Synapse Analytics * How Azure Synapse Analytics works * When to use Azure Synapse Analytics
4 - USE AZURE SYNAPSE SERVERLESS SQL POOL TO QUERY FILES IN A DATA LAKE * Understand Azure Synapse serverless SQL pool capabilities and use cases * Query files using a serverless SQL pool * Create external database objects
5 - USE AZURE SYNAPSE SERVERLESS SQL POOLS TO TRANSFORM DATA IN A DATA LAKE * Transform data files with the CREATE EXTERNAL TABLE AS SELECT statement * Encapsulate data transformations in a stored procedure * Include a data transformation stored procedure in a pipeline
6 - CREATE A LAKE DATABASE IN AZURE SYNAPSE ANALYTICS * Understand lake database concepts * Explore database templates * Create a lake database * Use a lake database
7 - ANALYZE DATA WITH APACHE SPARK IN AZURE SYNAPSE ANALYTICS * Get to know Apache Spark * Use Spark in Azure Synapse Analytics * Analyze data with Spark * Visualize data with Spark
8 - TRANSFORM DATA WITH SPARK IN AZURE SYNAPSE ANALYTICS * Modify and save dataframes * Partition data files * Transform data with SQL
9 - USE DELTA LAKE IN AZURE SYNAPSE ANALYTICS * Understand Delta Lake * Create Delta Lake tables * Create catalog tables * Use Delta Lake with streaming data * Use Delta Lake in a SQL pool (a Spark sketch follows at the end of this listing)
10 - ANALYZE DATA IN A RELATIONAL DATA WAREHOUSE * Design a data warehouse schema * Create data warehouse tables * Load data warehouse tables * Query a data warehouse
11 - LOAD DATA INTO A RELATIONAL DATA WAREHOUSE * Load staging tables * Load dimension tables * Load time dimension tables * Load slowly changing dimensions * Load fact tables * Perform post load optimization
12 - BUILD A DATA PIPELINE IN AZURE SYNAPSE ANALYTICS * Understand pipelines in Azure Synapse Analytics * Create a pipeline in Azure Synapse Studio * Define data flows * Run a pipeline
13 - USE SPARK NOTEBOOKS IN AN AZURE SYNAPSE PIPELINE * Understand Synapse Notebooks and Pipelines * Use a Synapse notebook activity in a pipeline * Use parameters in a notebook
14 - PLAN HYBRID TRANSACTIONAL AND ANALYTICAL PROCESSING USING AZURE SYNAPSE ANALYTICS * Understand hybrid transactional and analytical processing patterns * Describe Azure Synapse Link
15 - IMPLEMENT AZURE SYNAPSE LINK WITH AZURE COSMOS DB * Enable Cosmos DB account to use Azure Synapse Link * Create an analytical store enabled container * Create a linked service for Cosmos DB * Query Cosmos DB data with Spark * Query Cosmos DB with Synapse SQL
16 - IMPLEMENT AZURE SYNAPSE LINK FOR SQL * What is Azure Synapse Link for SQL? * Configure Azure Synapse Link for Azure SQL Database * Configure Azure Synapse Link for SQL Server 2022
17 - GET STARTED WITH AZURE STREAM ANALYTICS * Understand data streams * Understand event processing * Understand window functions
18 - INGEST STREAMING DATA USING AZURE STREAM ANALYTICS AND AZURE SYNAPSE ANALYTICS * Stream ingestion scenarios * Configure inputs and outputs * Define a query to select, filter, and aggregate data * Run a job to ingest data
19 - VISUALIZE REAL-TIME DATA WITH AZURE STREAM ANALYTICS AND POWER BI * Use a Power BI output in Azure Stream Analytics * Create a query for real-time visualization * Create real-time data visualizations in Power BI
20 - INTRODUCTION TO MICROSOFT PURVIEW * What is Microsoft Purview? * How Microsoft Purview works * When to use Microsoft Purview
21 - INTEGRATE MICROSOFT PURVIEW AND AZURE SYNAPSE ANALYTICS * Catalog Azure Synapse Analytics data assets in Microsoft Purview * Connect Microsoft Purview to an Azure Synapse Analytics workspace * Search a Purview catalog in Synapse Studio * Track data lineage in pipelines
22 - EXPLORE AZURE DATABRICKS * Get started with Azure Databricks * Identify Azure Databricks workloads * Understand key concepts
23 - USE APACHE SPARK IN AZURE DATABRICKS * Get to know Spark * Create a Spark cluster * Use Spark in notebooks * Use Spark to work with data files * Visualize data
24 - RUN AZURE DATABRICKS NOTEBOOKS WITH AZURE DATA FACTORY * Understand Azure Databricks notebooks and pipelines * Create a linked service for Azure Databricks * Use a Notebook activity in a pipeline * Use parameters in a notebook

ADDITIONAL COURSE DETAILS
The Nexus Humans DP-203T00 Data Engineering on Microsoft Azure training program is a workshop that blends sessions, lessons, and masterclasses designed to move your learning forward. This immersive bootcamp-style experience combines interactive lectures, hands-on labs, and collaborative hackathons to reinforce fundamental concepts. Guided by seasoned coaches, each session offers practical insights and skills for developing your expertise. Whether you are just beginning to build these professional skills or are a seasoned professional, the course aims to equip you with the knowledge needed for success. While we feel this is the best course for DP-203T00 Data Engineering on Microsoft Azure and one of our Top 10, we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
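To make the Spark and Delta Lake modules (7-9) more concrete, here is a minimal sketch of the kind of notebook code involved. It is illustrative rather than official courseware: the storage path, column names and table name are assumptions, and in a Synapse or Databricks notebook the `spark` session is already provided.

```python
# Minimal sketch: read a CSV from the data lake, transform it with Spark,
# and persist it as a partitioned Delta Lake catalog table queryable with Spark SQL.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In Synapse/Databricks notebooks `spark` already exists; getOrCreate() is harmless there.
spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://files@mydatalake.dfs.core.windows.net/sales/orders/*.csv"  # placeholder path

orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Basic transformation: derive a partition column and drop obviously bad rows
# (OrderDate and Quantity are example column names).
orders_clean = (
    orders
    .withColumn("order_year", F.year("OrderDate"))
    .filter(F.col("Quantity") > 0)
)

# Save as a Delta table, partitioned by year and registered in the metastore
(
    orders_clean.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_year")
    .saveAsTable("sales_orders")
)

# Query the Delta table with Spark SQL
spark.sql("""
    SELECT order_year, COUNT(*) AS order_count
    FROM sales_orders
    GROUP BY order_year
    ORDER BY order_year
""").show()
```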

DP-203T00 Data Engineering on Microsoft Azure
Delivered Online, 5 days, Jun 24th, 13:00 + 4 more
£2380

'Laurel' The Bespoke Handmade Luxury Leather Notebook / Diary Slim A6 pocket size

5.0(32)

By Hands of Tym

Laurel is the essence of personalised ease and style. The perfect gift for the budding writer or professional in your life - maybe that’s you. Never miss a meeting or big idea with this refillable leather diary / notebook cover (A6) that fits easily into a pocket or bag. Laurel features a contrasting adjustable strap with stud fastening to keep your work secure, and two internal card slots perfect for business meetings. Choose your preferred English hand-bound paper insert: plain or lined notebook, or diary. Made from respectfully-sourced Italian vegetable-tanned leather, Laurel is free from harsh chemicals. Over time the luxury leather will develop a buttery soft patina. Make it your own with bespoke personalisation. If you’re gifting Laurel, a personalised message can be embossed to the bottom right corner (as standard), but get in touch if you have other requests. Laurel has been thoughtfully designed and handmade by Hands of Tym, in Oxfordshire, England.

Size and Details
Slim A6 (Octavo Pocket) removable leather notebook or diary cover. Paper size: 140mm x 99mm. Thickness: 160 pages. Cover size: 155mm x 100mm x 25mm. 1x insert included. Made to order and available in multiple colours with contrasting strap.

Timeless design
Front strap features the Hands of Tym logo. Made from respectfully-sourced Italian vegetable-tanned leather (biodegradable). Hand cut. Handmade. Hand finished.

Other Sizes Available: Slim A4 (Folio Large), Slim A5 (Quarto Medium). Refill pad available to purchase.

Personalisation & Unique Stamp
What is personalisation? Personalisation is an embossed stamp of characters including letters, emojis etc. Please select from the options below if you would like to add to your product.
What is a Unique Stamp? A unique stamp is a custom stamp made up of an image of your choice. It can be handprints, a drawing, handwriting etc. We can discuss this with you once you place your order. (This is +£45)

HOW THIS WAS MADE
Made from respectfully-sourced Italian vegetable-tanned leather, Laurel is free from harsh chemicals. Over time the luxury leather will develop a buttery soft patina. The paper inserts are hand-bound by experienced artisans in England from a quality British paper. Laurel has been thoughtfully designed and handmade by Hands of Tym, in Oxfordshire, England.

Material and Care
Materials: Here at Hands of Tym, we are passionate about sourcing the highest quality sustainable materials. We use responsibly-sourced Italian vegetable-tanned leather. Using only the highest quality cuts, the leather will improve with use and age, developing a soft patina over time. Natural variations are a product feature of this material and contribute to the individual style of the product. The leather is free from harmful chemicals and biodegradable at the end of its life. Hand cut, handmade and hand finished in our workshop in Oxfordshire, England.
Care: Please note there may be natural changes in the colour of the materials and they may stain easily. Please be careful with oils and inks. When not in use, store your product in the soft cloth bag (provided) in a cool and dry place.
Repair: When you buy from Hands of Tym, you’re getting so much more than your new purchase. You’re creating a connection with the maker - Georgie Tym. If your leather product becomes damaged (after all, life happens), contact us to find out about repairing your item.

WHAT'S INCLUDED IN THE PRICE?
Laurel arrives complete with one insert; additional inserts can be purchased directly from us. Laurel arrives wrapped in tissue paper inside a black cotton dust bag, including a 'Made for you' card with care instructions and a unique code to identify your handmade leather notebook cover.

'Laurel' The Bespoke Handmade Luxury Leather Notebook / Diary Slim A6 pocket size
Delivered Online On Demand
£89

DP-601T00 Implementing a Lakehouse with Microsoft Fabric

By Nexus Human

Duration: 1 day (6 CPD hours)

This course is intended for
The primary audience for this course is data professionals who are familiar with data modeling, extraction, and analytics. It is designed for professionals who are interested in gaining knowledge about Lakehouse architecture, the Microsoft Fabric platform, and how to enable end-to-end analytics using these technologies. Job roles: Data Analyst, Data Engineer, Data Scientist.

Overview
* Describe end-to-end analytics in Microsoft Fabric
* Describe core features and capabilities of lakehouses in Microsoft Fabric
* Create a lakehouse
* Ingest data into files and tables in a lakehouse
* Query lakehouse tables with SQL
* Configure Spark in a Microsoft Fabric workspace
* Identify suitable scenarios for Spark notebooks and Spark jobs
* Use Spark dataframes to analyze and transform data
* Use Spark SQL to query data in tables and views
* Visualize data in a Spark notebook
* Understand Delta Lake and delta tables in Microsoft Fabric
* Create and manage delta tables using Spark
* Use Spark to query and transform data in delta tables
* Use delta tables with Spark structured streaming
* Describe Dataflow (Gen2) capabilities in Microsoft Fabric
* Create Dataflow (Gen2) solutions to ingest and transform data
* Include a Dataflow (Gen2) in a pipeline

This course is designed to build your foundational skills in data engineering on Microsoft Fabric, focusing on the Lakehouse concept. It explores the powerful capabilities of Apache Spark for distributed data processing and the essential techniques for efficient data management, versioning, and reliability by working with Delta Lake tables. It also covers data ingestion and orchestration using Dataflows Gen2 and Data Factory pipelines. The course includes a combination of lectures and hands-on exercises that will prepare you to work with lakehouses in Microsoft Fabric (see the sketch after the outline below for a flavour of the Spark code involved).

Course outline
INTRODUCTION TO END-TO-END ANALYTICS USING MICROSOFT FABRIC * Explore end-to-end analytics with Microsoft Fabric * Data teams and Microsoft Fabric * Enable and use Microsoft Fabric * Knowledge Check
GET STARTED WITH LAKEHOUSES IN MICROSOFT FABRIC * Explore the Microsoft Fabric Lakehouse * Work with Microsoft Fabric Lakehouses * Exercise - Create and ingest data with a Microsoft Fabric Lakehouse
USE APACHE SPARK IN MICROSOFT FABRIC * Prepare to use Apache Spark * Run Spark code * Work with data in a Spark dataframe * Work with data using Spark SQL * Visualize data in a Spark notebook * Exercise - Analyze data with Apache Spark
WORK WITH DELTA LAKE TABLES IN MICROSOFT FABRIC * Understand Delta Lake * Create delta tables * Work with delta tables in Spark * Use delta tables with streaming data * Exercise - Use delta tables in Apache Spark
INGEST DATA WITH DATAFLOWS GEN2 IN MICROSOFT FABRIC * Understand Dataflows (Gen2) in Microsoft Fabric * Explore Dataflows (Gen2) in Microsoft Fabric * Integrate Dataflows (Gen2) and Pipelines in Microsoft Fabric * Exercise - Create and use a Dataflow (Gen2) in Microsoft Fabric
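The sketch below illustrates the lakehouse workflow the outline describes: land a file in the lakehouse, save it as a Delta table, and query it with Spark SQL. It is an illustrative example only; the file path, column name and table name are assumptions, and in a Fabric notebook attached to a lakehouse the `spark` session is supplied for you.

```python
# Minimal sketch: ingest a file in a Microsoft Fabric lakehouse, save it as a Delta table,
# and query it with Spark SQL from a notebook attached to the lakehouse.
from pyspark.sql import SparkSession

# Fabric notebooks already provide `spark`; getOrCreate() simply reuses that session.
spark = SparkSession.builder.getOrCreate()

# Read a CSV uploaded to the lakehouse "Files" area into a Spark dataframe
df = (
    spark.read
    .format("csv")
    .option("header", "true")
    .load("Files/raw/products.csv")   # illustrative path under the lakehouse Files area
)

# Light cleanup, then save as a managed Delta table in the lakehouse "Tables" area
df_clean = df.dropDuplicates().na.drop(subset=["ProductID"])  # ProductID is an example column
df_clean.write.format("delta").mode("overwrite").saveAsTable("products")

# The table is now queryable with Spark SQL (and via the lakehouse SQL analytics endpoint)
spark.sql("SELECT COUNT(*) AS product_count FROM products").show()
```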

DP-601T00 Implementing a Lakehouse with Microsoft Fabric
Delivered Online, two days, Aug 26th, 13:00 + 2 more
£595

Leather Working Course

4.9(61)

By Jack Raven Bushcraft Ltd

Run from our indoor workshop in Tenterden, Kent, we’ve put this one-day leather working course together for anyone with an interest in leather or craft, or who simply wants to explore another avenue of bushcraft. You don’t need any previous experience with leather working as you will learn the core skills and techniques on the course. It’s a perfect introduction to making traditional leather items. So if you’re looking for an unforgettable day, join our experienced craftsman, Paul Bradley, for a leather working day that focuses on leather work tools, design and construction. We have designed this course around the making of a simple notebook holder. We have chosen this as a project because it covers all the basic skills required to produce almost any small leather goods, and there is also large scope for customisation of your project. Along the way you will learn about different leathers, techniques and tools. Typically, under the expert guidance and instruction of Paul Bradley, you will:
- Use the provided pattern to produce a leather cover for a standard 5.5 x 3.5 notebook
- Be guided on how to design, draw out and customise a pattern
- Learn all about the different tools used in leather working and how to use them
- Learn how to cut out, assemble and mark for stitching
- Learn and practise the techniques of the traditional saddle stitch to complete the assembly of your notebook
- Learn how to finish your project with edge burnishing techniques and other small but important details that add to the overall professional look.
A really great chance to begin a lifetime of making beautiful things!
About us and the venue: Jack Raven Bushcraft was founded in 2011 and we've been delivering bushcraft, craft and foraging courses since then from our 30 acre private ancient woodland on the Kent Downs in an area of outstanding natural beauty. This course is held in a purpose-built indoor workshop.
What's included in the price? All tools, equipment and materials are provided, as well as a free notebook to take away with you. Tea and coffee provided.
Cancellation policy: Strict - cancellation and a full refund can be obtained up to 8 weeks before the course starts.

Leather Working Course
Delivered In-Person in Ashford, 6 hours, Sept 8th, 09:00 + 1 more
£105

Beginners' Guide to Practical Quantum Computing with IBM Qiskit

By Packt

This course is intended for beginner-level individuals who are fascinated by quantum computing and want to learn more about it. It uses Jupyter notebooks and IBM's Qiskit toolkit so you can put what you learn into practice with real computations.
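As a taste of the hands-on work involved, here is a minimal Bell-state example of the sort you would run in a Jupyter notebook with Qiskit. This sketch is illustrative rather than taken from the course materials; it assumes qiskit and qiskit-aer are installed, and exact imports can vary between Qiskit releases.

```python
# Minimal sketch: build and simulate a two-qubit Bell state with Qiskit.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)      # two qubits, two classical bits
qc.h(0)                        # put qubit 0 into superposition
qc.cx(0, 1)                    # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])     # measure both qubits

sim = AerSimulator()
counts = sim.run(qc, shots=1024).result().get_counts()
print(counts)                  # expect roughly half '00' and half '11'
```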

Beginners' Guide to Practical Quantum Computing with IBM Qiskit
Delivered Online On Demand
£80.99

Embroidery workshop at Big Penny Social

By cheekyhandmades

Join me at the Big Penny Social for a special evening of hand stitching. I will take you through the very cool technique of chickenscratch embroidery, which is done on gingham fabric with normal embroidery threads. Simple techniques can be combined in endless ways and colours to adorn clothes, homewares, notebook covers - all kinds of things.

Embroidery workshop at Big Penny Social
Delivered In-Person
Dates arranged on request
£25

Educators matching "Notebook"

Mel Parks

East Grinstead

My Work

Research
Researching the stories we tell ourselves and other people, both historically and in the present day. I delve into stories in academic research, the media, memoir, diaries, fiction, art, as well as myths and fairy tales, to understand the dominant narrative.

Write
Writing my own stories. I don’t ask other people to do anything I haven’t done myself, so my research includes autoethnographic or creative work of my own. Writing myself into my work is integral to understanding and shifting the narrative. I also work in creative collaboration with others.

Facilitate
Helping others tell their stories. My aim is to encourage diversity, complexity and specificity. There are no quick fixes, tidy endings, or moments of complete resolution in life, and the stories we write and create will ideally represent this. Stories don’t need to be straightforward narratives, so I offer tips and techniques and make space for blends of different types of writing (e.g. poetry, lyric essays, journals or reflective writing) or other understandings of stories such as craft or visual representations.

Curate
Sharing stories. I aim to do all that I can to help little-heard stories shake the hearts and change the minds of policy makers and other people who keep the dominant narrative going. I do this by commissioning, editing and publishing stories of community and co-production on Ideas Hub; organising events; speaking at conferences; creating online archives; and writing and publishing articles about my work.