For clients: Go to the Library, select My company questions, and then click the + button at the bottom right to create a question.
For freelancers/creators: Go to your account, select your avatar at the top right, and click Content Dashboard.
- Then, click the + button at the bottom right to create a question.
- In the Create New Problem dialog box, enter the problem name, choose the difficulty level, and select Data Science as the Problem Type.
- On the Problem Statement page, specify the following:
- Problem Name: The problem name should be short and topical. It should not give away any hints about how to solve the problem.
- Expected solving time (in minutes): The time required to solve the problem; it must be an integer between 1 and 999.
- Description: A good problem statement comprises a clear, detailed description of the problem and at least one sample output.
- Difficulty level: Set the difficulty level for your question by tagging it as Easy, Medium, or Hard.
- Scoring: You can keep the default scoring or change it, optionally adding a penalty for wrong submissions.
- Maximum re-submissions allowed: Restricts the number of times a solution can be submitted; leave it as 0 for no limit.
- Execution time limit: Stops the execution of the code after the specified time; leave it blank to use the default limits.
- Allowed programming languages: Select the languages in which this problem can be answered, or leave the field empty to allow all languages.
- Discovery tags, Insight tags: Tags are words or phrases that help with the searchability and organization of your questions. You can add existing tags or create new ones.
- Primary Technology: Select a primary technology that best represents your problem.
- Datasets: Upload the datasets required for training models and for evaluation. These datasets must be .zip files, each up to 20 MB in size.
- Training Dataset: The dataset that candidates use to train their models. When the training dataset .zip file is uploaded, it is unzipped and stored under /data/training/. For example, if the .zip file contains file1.csv, file2.csv, and file3.csv, these files are accessible at the following locations (see the code sketch after the dataset list):
- /data/training/file1.csv
- /data/training/file2.csv
- /data/training/file3.csv
- Evaluation Dataset: The dataset that candidates use to predict the outcome. The target variable is not provided in this dataset. The evaluation dataset is used when the problem is submitted.
- Name: test.csv
- Location: /data/test/test.csv
- Validation Dataset: The format that candidates should follow to create their submission file. The validation dataset is used when the problem is run.
- Name: sample_submission.csv
- Location: /data/test/sample_submission.csv
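Below is a minimal Python sketch of how a candidate can load these files, assuming pandas is available in the execution environment. The file name file1.csv is the illustrative name from the example above, and the sample_submission.csv path follows the Name and Location listed for the validation dataset.

```python
import pandas as pd

# Training data: every file from the uploaded .zip is extracted to /data/training/
# (file1.csv is the illustrative name used in the example above).
train = pd.read_csv("/data/training/file1.csv")

# Evaluation data: the rows to predict; the target variable is not included.
test = pd.read_csv("/data/test/test.csv")

# Validation dataset: shows the exact column layout a submission must follow.
sample_submission = pd.read_csv("/data/test/sample_submission.csv")
print(sample_submission.columns.tolist())
```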
- Stubs: In the Stubs section, click + ADD NEW to generate a code stub. Specify the technology (programming language), optionally add the necessary imports and code, and then click SAVE to generate the stub.
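For example, a Python stub might preload common imports and the dataset paths so that every candidate starts from the same scaffold. This is only a sketch; the actual stub content is whatever you choose to provide.

```python
# Illustrative Python stub; adjust the imports and constants to your problem.
import pandas as pd

TRAIN_DIR = "/data/training/"
TEST_PATH = "/data/test/test.csv"

# Candidates write their solution below this line and save their predictions
# in the format shown by sample_submission.csv.
```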
- Sample solutions: Click + Add New, select the technology (programming language), and add a working solution to the problem.
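As an illustration, a minimal Python solution for a classification problem could look like the sketch below. It assumes pandas and scikit-learn are available, and the file name file1.csv and the column names id and target are hypothetical placeholders for your own dataset.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical file and column names; replace them with the ones in your datasets.
train = pd.read_csv("/data/training/file1.csv")
test = pd.read_csv("/data/test/test.csv")

features = [c for c in train.columns if c not in ("id", "target")]
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(train[features], train["target"])

# Write the predictions in the same layout as sample_submission.csv.
submission = pd.DataFrame({"id": test["id"], "target": model.predict(test[features])})
submission.to_csv("submission.csv", index=False)
```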
- Test cases: In the test cases section of the displayed page, perform the following operations:
- Click + Add New and specify the name of the test case.
- Select the programming language for your test case code.
- Write the test case code that evaluates the solution (see the sketch after this list).
- Enter the weightage for each test case. If the code passes a particular test case, the corresponding score is assigned: test case score = (test case weight / total weight) × total score. For example, a test case weighted 2 out of a total weight of 5 on a 100-point problem contributes 40 points when passed.
- Optional: Select the Is sample test case? check box if you want the test case to be a sample test case.
- Click +ADD TESTCASE.
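For instance, a Python test case could compare the candidate's submission file against a ground-truth file uploaded as an attachment. The file names, column names, metric, and passing threshold below are hypothetical; replace them with your own.

```python
import pandas as pd
from sklearn.metrics import accuracy_score

# Hypothetical paths: ground_truth.csv is an attachment with the true labels,
# submission.csv is the file written by the candidate's solution.
truth = pd.read_csv("ground_truth.csv")
submission = pd.read_csv("submission.csv")

# Align predictions with the true labels and compute the evaluation metric.
merged = truth.merge(submission, on="id", suffixes=("_true", "_pred"))
score = accuracy_score(merged["target_true"], merged["target_pred"])

# The test case passes only if the model clears the (hypothetical) threshold.
assert score >= 0.8, f"Accuracy {score:.3f} is below the required 0.8"
print("Test case passed")
```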
- Attachments: Add any extra files required by the test cases. Use the file link as the file path.
- Click Save Problem to save the question.