@chrisb @mcla Thank you both for the math app files; these look good! Ideally I would want the option to draw both straight lines and hyperbolas, so if you know of a way to do this, let me know. For now I will try to adjust these math apps a bit to fit my needs, but I am not that skilled yet with MathApps, so if you have any tips, please let me know.
@jmtrik I have been completely unsuccessful in loading JavaScript from external sources.
For example, I am trying to load the following small pieces of code, either in the main question window or, if needed, in the HTML response type. The external source does not seem to be loading.
<script src="https://d3js.org/d3.v3.min.js"></script>
<script src="https://mauriciopoppe.github.io/function-plot/js/function-plot.js"></script>

(The script tags are reconstructed here; the forum would not let me post HTML containing a script tag directly.)
Hi @jmtrik, Thank you for your interest in this question! I know that you have been working on this for the other types of MapleTA questions. In MathApp questions we seem to have a good opportunity to 'communicate' with the student via the startup code, and to have his/her input automatically guided in the direction of the correct solution.

One first important issue is correcting the syntax of an answer given in e.g. a TextArea box. Another is the use of 'forbidden' operators in the answer box, e.g. int(x^3, x) when the student is asked to hand in the value of the integral. A third issue is the possibility of communicating adaptive guidance towards the correct answer. All within the same single (possibly even randomized) question.

I have uploaded two MathApps to the Maple Cloud as illustrations. You will see from the startup code that the syntax check there (by try/catch) is still somewhat ad hoc, and that this particular check cannot catch the error stemming from the missing comma in e.g. [[8,9] [4,5]]. Moreover, some error messages are not particularly informative to the student; an example is obtained when you miss a parenthesis in e.g. [[8, 9], [4,5]. Other error messages are quite informative, for example the one you get when you write int(x, x = ...).

I know that you can, of course, just count the parentheses etc. and communicate errors of that type easily to the student, but it would be very nice to have a general descriptive 'unfolding' of all possible errors, like the one for int(x, x = ...). Do you know if such a list exists? And if it does, how can the try/catch machinery then be used to show the 'unfolding' of a given error?

I enclose links to the two mentioned examples in the Maple Cloud (direct upload of *.mw files to the present platform seems to be impossible from this end): Input_Check_01 and Input_Check_02. Best, Steen
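As an illustration of what such a try/catch-based check in the startup code could look like, here is a minimal, hypothetical Maple sketch. The variable names, the sample input, and the feedback texts are my own, not taken from the uploaded MathApps; the error 'unfolding' uses the documented lastexception variable together with StringTools:-FormatMessage:

```
# Hypothetical syntax check for a student's typed answer; the student's
# input is assumed to arrive as a string (here hard-coded for the demo).
studentInput := "[[8, 9], [4,5]":   # note the missing closing bracket

try
    parsed := parse(studentInput);  # parse raises an error on malformed input
    feedback := "Input parsed successfully.";
catch:
    # lastexception holds the most recent error; its second and later
    # operands are the message format string and its parameters, which
    # FormatMessage turns into the text Maple would normally display.
    feedback := cat("Syntax problem: ",
                    StringTools:-FormatMessage(lastexception[2 .. -1]));
end try:

feedback;
```

This only catches errors that parse itself raises, so (as noted above) a missing comma such as [[8,9] [4,5]] may slip through as a syntactically valid product; a more complete check would have to inspect the parsed structure as well.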
We solved it: it is sufficient to remove the lines relating to "privacy", "automodule" and "exportFrom" from the XML file, because they mislead the TA 10 interpreter while not being necessary for the correct interpretation of the contents. Thanks anyway!
1) The hard way: design the question using HTML response areas. This can definitely work; however, my knowledge of the required HTML is too limited to throw something together for you right away.
2) The easy way: use question chaining to construct a separate question for each of the question parts. (Note: you could then display all 4 or 5 questions on the same page of an assignment so that they look like a single question with 4 or 5 parts.) To accomplish this, use the Maple-graded response type; for example, for question 4:
question1 := evalb(($response.MyAssignmentQ1.1.1) = ($answerNumber1));
question2 := evalb(($response.MyAssignmentQ2.2.1) = ($answerNumber2));
question3 := evalb(($response.MyAssignmentQ3.3.1) = ($answerNumber3));
question4 := evalb(($RESPONSE) = ($answerNumber4));
question1 and question2 and question3 and question4;
This requires that students get full credit on all three prior answers, as well as on question 4 itself, in order for question 4 to be marked correct.
Thank you for the response. I do not want to pull the variables into Maple until they are needed (in other words, not all together at the start).
The reasoning is that user A might have only 3 defined answers to the particular question, while user B might have 7. I was hoping a for loop could gather the answers gradually, rather than having them all defined at the start of the call to Maple.
A more thorough search through the Maple TA online help has shed some more light on this question: including the command randomize(): in Maple-based variable definitions sets the initial state of the random number generator using a number based on the system clock, instead of the default seed in Maple.
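For example, a Maple-based algorithm variable defined along the following lines (the variable name and range are illustrative, not from any particular question) would then produce a different value for each new instance of the question, rather than the same value every time:

```
# Illustrative algorithm line: randomize() reseeds from the system clock,
# after which rand(1..10)() returns a random integer between 1 and 10.
$a = maple("randomize(): rand(1..10)();");
```

Without the randomize() call, Maple starts from its default seed, so every fresh Maple session would generate the same "random" number.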
Thanks for looking into this! Unfortunately we do not (yet) have a local server for MapleTA, but this security issue could be an argument for becoming self-hosted. One way around the concrete problem (without self-hosting, I think) is to embed the needed (if not all) *.mla procedures directly, via manual copy-paste, into the startup code from the worksheet that defines the *.mla file. But this, of course, is much more cumbersome than the desired simple one-line reference to the *.mla file itself.
Your answer, however, also raises a similar question concerning the use of repository files inside MapleTA itself, as thoroughly explained in: [https://mapletacommunity.com/topic/64/how-to-create-and-use-a-maple-repository-in-maple-ta]. Admittedly, I have not checked this out yet, but the question is whether this functionality has also been deprecated or blocked in the meantime?
Yes, of course, thank you very much! My mistake was to think that everything could be 'driven' from the startup code. I see now that this is neither possible nor, in fact, necessary when using TA both to set up the variables for the MathApp and to give precise, individual feedback.
OK, thank you. The possibility of allowing the student to navigate to other sites or programs during the exam is under discussion, since it may not be compliant with standards for safe exam procedures, but this option can be taken into account.
Yes, I'm using the 2016 version. It worked, thank you! I used $x*$response.provachain1.1.1 in the response area, where $x=4; was defined as the only line of the algorithm. It is also good to know that the user-addable "id" field under "Information Fields" is not equivalent to the "id" appearing in the source.